Column schema of the dump (length ranges per column):

| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | list | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | list | 0 | 25 |
| languages | list | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | list | 0 | 352 |
| processed_texts | list | 1 | 353 |
| tokens_length | list | 1 | 353 |
| input_texts | list | 1 | 40 |
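As a rough sketch of how rows with this schema could be consumed (the dataset path and split name below are placeholders, not taken from this dump; only the column names come from the schema above):

```python
import json
from datasets import load_dataset

# Placeholder repo id and split name -- adjust to wherever this dump is hosted.
rows = load_dataset("your-namespace/dataset-cards-dump", split="train")

row = rows[0]
print(row["id"], row["created_at"])   # dataset repo id and creation timestamp
card_markdown = row["text"]           # raw dataset card body (markdown)
meta = json.loads(row["metadata"])    # the metadata column is a JSON string
print(meta.get("configs"))            # e.g. declared configs and data files
```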
c1cfaf133ec65cfd84213a990e313aff2f536e66
# Dataset Card for "mmlu-college_medicine-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-college_medicine-neg-prepend-fix
[ "region:us" ]
2023-08-18T10:59:34+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6685, "num_examples": 5}, {"name": "test", "num_bytes": 601115, "num_examples": 173}], "download_size": 16063, "dataset_size": 607800}}
2023-08-21T06:33:02+00:00
[]
[]
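For reference, a minimal loading sketch for the record above (the repo id, splits, and feature names come from the metadata; treating `neg_question` as the negated variant of the original question is an inference from the field names, not stated in the card):

```python
from datasets import load_dataset

# Load the "default" config declared above: a dev split (5 rows) and a test split (173 rows).
ds = load_dataset("joey234/mmlu-college_medicine-neg-prepend-fix")
print(ds)

example = ds["test"][0]
print(example["question"])                    # original MMLU college_medicine question
print(example["neg_question"])                # presumably the negated question variant
print(example["choices"], example["answer"])  # answer is a class label index (0-3 -> A-D)
```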
b2585de8ac3fca81d5737fc0d6980b3159b1fe73
# Dataset Card for Evaluation run of xzuyn/Alpacino-SuperCOT-13B

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/Alpacino-SuperCOT-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [xzuyn/Alpacino-SuperCOT-13B](https://huggingface.co/xzuyn/Alpacino-SuperCOT-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-14T21:25:30.263923](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B/blob/main/results_2023-10-14T21-25-30.263923.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.034710570469798654,
        "em_stderr": 0.001874559812477868,
        "f1": 0.09780201342281894,
        "f1_stderr": 0.002213982622562692,
        "acc": 0.4222955971643869,
        "acc_stderr": 0.00954675265516166
    },
    "harness|drop|3": {
        "em": 0.034710570469798654,
        "em_stderr": 0.001874559812477868,
        "f1": 0.09780201342281894,
        "f1_stderr": 0.002213982622562692
    },
    "harness|gsm8k|5": {
        "acc": 0.07505686125852919,
        "acc_stderr": 0.007257633145486642
    },
    "harness|winogrande|5": {
        "acc": 0.7695343330702447,
        "acc_stderr": 0.011835872164836676
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
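Beyond the per-task configs, the aggregated metrics can also be pulled from the "results" config; a small sketch (the config name "results" and the "latest" split are taken from the repository metadata further down):

```python
from datasets import load_dataset

# "results" aggregates all runs; the "latest" split points at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B",
    "results",
    split="latest",
)
print(results[0])  # row(s) carrying the aggregated metrics of the latest run
```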
open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B
[ "region:us" ]
2023-08-18T10:59:35+00:00
{"pretty_name": "Evaluation run of xzuyn/Alpacino-SuperCOT-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [xzuyn/Alpacino-SuperCOT-13B](https://huggingface.co/xzuyn/Alpacino-SuperCOT-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T21:25:30.263923](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B/blob/main/results_2023-10-14T21-25-30.263923.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.034710570469798654,\n \"em_stderr\": 0.001874559812477868,\n \"f1\": 0.09780201342281894,\n \"f1_stderr\": 0.002213982622562692,\n \"acc\": 0.4222955971643869,\n \"acc_stderr\": 0.00954675265516166\n },\n \"harness|drop|3\": {\n \"em\": 0.034710570469798654,\n \"em_stderr\": 0.001874559812477868,\n \"f1\": 0.09780201342281894,\n \"f1_stderr\": 0.002213982622562692\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07505686125852919,\n \"acc_stderr\": 0.007257633145486642\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836676\n }\n}\n```", "repo_url": "https://huggingface.co/xzuyn/Alpacino-SuperCOT-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|arc:challenge|25_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T21_25_30.263923", "path": ["**/details_harness|drop|3_2023-10-14T21-25-30.263923.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T21-25-30.263923.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T21_25_30.263923", "path": ["**/details_harness|gsm8k|5_2023-10-14T21-25-30.263923.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T21-25-30.263923.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hellaswag|10_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:16:23.975101.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:16:23.975101.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T14:16:23.975101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T14:16:23.975101.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T14:16:23.975101.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T21_25_30.263923", "path": ["**/details_harness|winogrande|5_2023-10-14T21-25-30.263923.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T21-25-30.263923.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T14_16_23.975101", "path": ["results_2023-07-18T14:16:23.975101.parquet"]}, {"split": "2023_10_14T21_25_30.263923", "path": ["results_2023-10-14T21-25-30.263923.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T21-25-30.263923.parquet"]}]}]}
2023-10-14T20:25:42+00:00
[]
[]
fc29790402a5888b8b3f40b97b1e47bc1fd8538e
# Dataset Card for Evaluation run of xzuyn/MedicWizard-7B

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/MedicWizard-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [xzuyn/MedicWizard-7B](https://huggingface.co/xzuyn/MedicWizard-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__MedicWizard-7B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T06:29:30.615749](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__MedicWizard-7B/blob/main/results_2023-10-13T06-29-30.615749.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.20186661073825504,
        "em_stderr": 0.00411064182515535,
        "f1": 0.27016044463087346,
        "f1_stderr": 0.004163667139435202,
        "acc": 0.3774417729343401,
        "acc_stderr": 0.009385788895082446
    },
    "harness|drop|3": {
        "em": 0.20186661073825504,
        "em_stderr": 0.00411064182515535,
        "f1": 0.27016044463087346,
        "f1_stderr": 0.004163667139435202
    },
    "harness|gsm8k|5": {
        "acc": 0.04927975739196361,
        "acc_stderr": 0.005962150655812479
    },
    "harness|winogrande|5": {
        "acc": 0.7056037884767167,
        "acc_stderr": 0.012809427134352413
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
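A complementary sketch: enumerate the available per-task configs before loading one (config names such as `harness_drop_3` and the `latest` split are taken from the metadata that follows; the standard `datasets` API is assumed):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_xzuyn__MedicWizard-7B"

# List the per-task configs exposed by this details repository.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load, for example, the 3-shot DROP details from the most recent run.
drop_details = load_dataset(repo, "harness_drop_3", split="latest")
print(drop_details[0].keys())
```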
open-llm-leaderboard/details_xzuyn__MedicWizard-7B
[ "region:us" ]
2023-08-18T10:59:44+00:00
{"pretty_name": "Evaluation run of xzuyn/MedicWizard-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [xzuyn/MedicWizard-7B](https://huggingface.co/xzuyn/MedicWizard-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__MedicWizard-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T06:29:30.615749](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__MedicWizard-7B/blob/main/results_2023-10-13T06-29-30.615749.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20186661073825504,\n \"em_stderr\": 0.00411064182515535,\n \"f1\": 0.27016044463087346,\n \"f1_stderr\": 0.004163667139435202,\n \"acc\": 0.3774417729343401,\n \"acc_stderr\": 0.009385788895082446\n },\n \"harness|drop|3\": {\n \"em\": 0.20186661073825504,\n \"em_stderr\": 0.00411064182515535,\n \"f1\": 0.27016044463087346,\n \"f1_stderr\": 0.004163667139435202\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04927975739196361,\n \"acc_stderr\": 0.005962150655812479\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7056037884767167,\n \"acc_stderr\": 0.012809427134352413\n }\n}\n```", "repo_url": "https://huggingface.co/xzuyn/MedicWizard-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T06_29_30.615749", "path": ["**/details_harness|drop|3_2023-10-13T06-29-30.615749.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T06-29-30.615749.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T06_29_30.615749", "path": ["**/details_harness|gsm8k|5_2023-10-13T06-29-30.615749.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T06-29-30.615749.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:48:49.794718.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:48:49.794718.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:48:49.794718.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:48:49.794718.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:48:49.794718.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:48:49.794718.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T06_29_30.615749", "path": ["**/details_harness|winogrande|5_2023-10-13T06-29-30.615749.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T06-29-30.615749.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T11_48_49.794718", "path": ["results_2023-07-18T11:48:49.794718.parquet"]}, {"split": "2023_10_13T06_29_30.615749", "path": ["results_2023-10-13T06-29-30.615749.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T06-29-30.615749.parquet"]}]}]}
2023-10-13T05:29:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of xzuyn/MedicWizard-7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model xzuyn/MedicWizard-7B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T06:29:30.615749 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
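The load snippet referenced just above was dropped when this card text was flattened; the sketch below mirrors the one preserved in this record's metadata earlier in the file (the repository id and the `harness_winogrande_5` configuration name are taken verbatim from it, so only the variable name is mine):

```python
from datasets import load_dataset

# Per-example details of the winogrande run for xzuyn/MedicWizard-7B
data = load_dataset(
    "open-llm-leaderboard/details_xzuyn__MedicWizard-7B",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest run for this config
)
print(data[0])
```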
[ "# Dataset Card for Evaluation run of xzuyn/MedicWizard-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xzuyn/MedicWizard-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T06:29:30.615749(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of xzuyn/MedicWizard-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xzuyn/MedicWizard-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T06:29:30.615749(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xzuyn/MedicWizard-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xzuyn/MedicWizard-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T06:29:30.615749(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2790851b2c74af6042a4736a2f9a24b90662d21c
# Dataset Card for "mmlu-college_physics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-college_physics-neg-prepend-fix
[ "region:us" ]
2023-08-18T10:59:48+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 7147, "num_examples": 5}, {"name": "test", "num_bytes": 282713, "num_examples": 102}], "download_size": 15541, "dataset_size": 289860}}
2023-08-21T06:33:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-college_physics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-college_physics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-college_physics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 28 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-college_physics-neg-prepend-fix\"\n\nMore Information needed" ]
da750986a76a9751a6db701c5154167f267de32b
# Dataset Card for Evaluation run of gywy/llama2-13b-chinese-v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/gywy/llama2-13b-chinese-v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [gywy/llama2-13b-chinese-v1](https://huggingface.co/gywy/llama2-13b-chinese-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-07-26T15:10:00.921624](https://huggingface.co/datasets/open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1/blob/main/results_2023-07-26T15%3A10%3A00.921624.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5420814148370803, "acc_stderr": 0.03472894201222865, "acc_norm": 0.5463875639849175, "acc_norm_stderr": 0.03471430598894899, "mc1": 0.3182374541003672, "mc1_stderr": 0.016305988648920612, "mc2": 0.45724154700953135, "mc2_stderr": 0.015310459215672905 }, "harness|arc:challenge|25": { "acc": 0.5631399317406144, "acc_stderr": 0.014494421584256513, "acc_norm": 0.5981228668941979, "acc_norm_stderr": 0.014327268614578278 }, "harness|hellaswag|10": { "acc": 0.5381398127862975, "acc_stderr": 0.004975243508751998, "acc_norm": 0.7572196773551085, "acc_norm_stderr": 0.004278871104930374 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5657894736842105, "acc_stderr": 0.04033565667848319, "acc_norm": 0.5657894736842105, "acc_norm_stderr": 0.04033565667848319 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5924528301886792, "acc_stderr": 0.030242233800854494, "acc_norm": 0.5924528301886792, "acc_norm_stderr": 0.030242233800854494 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.041553199555931467, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.041553199555931467 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 
0.04902071300001974 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5086705202312138, "acc_stderr": 0.03811890988940412, "acc_norm": 0.5086705202312138, "acc_norm_stderr": 0.03811890988940412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006717, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006717 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.42127659574468085, "acc_stderr": 0.03227834510146268, "acc_norm": 0.42127659574468085, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2982456140350877, "acc_stderr": 0.04303684033537314, "acc_norm": 0.2982456140350877, "acc_norm_stderr": 0.04303684033537314 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.503448275862069, "acc_stderr": 0.04166567577101579, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.04166567577101579 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3201058201058201, "acc_stderr": 0.024026846392873506, "acc_norm": 0.3201058201058201, "acc_norm_stderr": 0.024026846392873506 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30952380952380953, "acc_stderr": 0.04134913018303316, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.04134913018303316 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6612903225806451, "acc_stderr": 0.026923446059302837, "acc_norm": 0.6612903225806451, "acc_norm_stderr": 0.026923446059302837 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.43349753694581283, "acc_stderr": 0.034867317274198714, "acc_norm": 0.43349753694581283, "acc_norm_stderr": 0.034867317274198714 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6727272727272727, "acc_stderr": 0.036639749943912434, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.036639749943912434 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.03003114797764154, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.03003114797764154 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4846153846153846, "acc_stderr": 0.025339003010106515, "acc_norm": 0.4846153846153846, "acc_norm_stderr": 0.025339003010106515 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.02803792996911499, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.02803792996911499 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6008403361344538, "acc_stderr": 
0.03181110032413926, "acc_norm": 0.6008403361344538, "acc_norm_stderr": 0.03181110032413926 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.038020397601079024, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.038020397601079024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7009174311926606, "acc_stderr": 0.019630417285415175, "acc_norm": 0.7009174311926606, "acc_norm_stderr": 0.019630417285415175 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4027777777777778, "acc_stderr": 0.03344887382997867, "acc_norm": 0.4027777777777778, "acc_norm_stderr": 0.03344887382997867 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7401960784313726, "acc_stderr": 0.03077855467869326, "acc_norm": 0.7401960784313726, "acc_norm_stderr": 0.03077855467869326 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7426160337552743, "acc_stderr": 0.028458820991460302, "acc_norm": 0.7426160337552743, "acc_norm_stderr": 0.028458820991460302 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6636771300448431, "acc_stderr": 0.031708824268455, "acc_norm": 0.6636771300448431, "acc_norm_stderr": 0.031708824268455 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6335877862595419, "acc_stderr": 0.042258754519696365, "acc_norm": 0.6335877862595419, "acc_norm_stderr": 0.042258754519696365 }, "harness|hendrycksTest-international_law|5": { "acc": 0.71900826446281, "acc_stderr": 0.04103203830514512, "acc_norm": 0.71900826446281, "acc_norm_stderr": 0.04103203830514512 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6134969325153374, "acc_stderr": 0.03825825548848608, "acc_norm": 0.6134969325153374, "acc_norm_stderr": 0.03825825548848608 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.30357142857142855, "acc_stderr": 0.04364226155841044, "acc_norm": 0.30357142857142855, "acc_norm_stderr": 0.04364226155841044 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.044986763205729245, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.044986763205729245 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7564102564102564, "acc_stderr": 0.02812096650391442, "acc_norm": 0.7564102564102564, "acc_norm_stderr": 0.02812096650391442 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7254150702426565, "acc_stderr": 0.015959829933084025, "acc_norm": 0.7254150702426565, "acc_norm_stderr": 0.015959829933084025 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6098265895953757, "acc_stderr": 0.026261677607806636, "acc_norm": 0.6098265895953757, "acc_norm_stderr": 0.026261677607806636 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.37094972067039106, "acc_stderr": 0.016155910721341774, "acc_norm": 0.37094972067039106, "acc_norm_stderr": 0.016155910721341774 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5784313725490197, "acc_stderr": 0.028275490156791455, "acc_norm": 0.5784313725490197, "acc_norm_stderr": 0.028275490156791455 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6237942122186495, "acc_stderr": 0.02751392568354943, "acc_norm": 0.6237942122186495, "acc_norm_stderr": 0.02751392568354943 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.5740740740740741, "acc_stderr": 0.02751374728437942, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.02751374728437942 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4432624113475177, "acc_stderr": 0.029634838473766, "acc_norm": 0.4432624113475177, "acc_norm_stderr": 0.029634838473766 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3970013037809648, "acc_stderr": 0.012496346982909553, "acc_norm": 0.3970013037809648, "acc_norm_stderr": 0.012496346982909553 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5073529411764706, "acc_stderr": 0.030369552523902173, "acc_norm": 0.5073529411764706, "acc_norm_stderr": 0.030369552523902173 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5375816993464052, "acc_stderr": 0.02017061497496976, "acc_norm": 0.5375816993464052, "acc_norm_stderr": 0.02017061497496976 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6122448979591837, "acc_stderr": 0.031192230726795656, "acc_norm": 0.6122448979591837, "acc_norm_stderr": 0.031192230726795656 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7263681592039801, "acc_stderr": 0.031524391865554016, "acc_norm": 0.7263681592039801, "acc_norm_stderr": 0.031524391865554016 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.038879718495972646, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.038879718495972646 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.03274485211946956, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.03274485211946956 }, "harness|truthfulqa:mc|0": { "mc1": 0.3182374541003672, "mc1_stderr": 0.016305988648920612, "mc2": 0.45724154700953135, "mc2_stderr": 0.015310459215672905 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1
[ "region:us" ]
2023-08-18T10:59:52+00:00
{"pretty_name": "Evaluation run of gywy/llama2-13b-chinese-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [gywy/llama2-13b-chinese-v1](https://huggingface.co/gywy/llama2-13b-chinese-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-26T15:10:00.921624](https://huggingface.co/datasets/open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1/blob/main/results_2023-07-26T15%3A10%3A00.921624.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5420814148370803,\n \"acc_stderr\": 0.03472894201222865,\n \"acc_norm\": 0.5463875639849175,\n \"acc_norm_stderr\": 0.03471430598894899,\n \"mc1\": 0.3182374541003672,\n \"mc1_stderr\": 0.016305988648920612,\n \"mc2\": 0.45724154700953135,\n \"mc2_stderr\": 0.015310459215672905\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256513,\n \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578278\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5381398127862975,\n \"acc_stderr\": 0.004975243508751998,\n \"acc_norm\": 0.7572196773551085,\n \"acc_norm_stderr\": 0.004278871104930374\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 
0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.034867317274198714,\n \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.034867317274198714\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415175,\n \"acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415175\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997867,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997867\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.042258754519696365,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.042258754519696365\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848608,\n \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848608\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n \"acc_stderr\": 0.02812096650391442,\n \"acc_norm\": 0.7564102564102564,\n \"acc_norm_stderr\": 0.02812096650391442\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7254150702426565,\n \"acc_stderr\": 0.015959829933084025,\n \"acc_norm\": 
0.7254150702426565,\n \"acc_norm_stderr\": 0.015959829933084025\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806636,\n \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806636\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n \"acc_stderr\": 0.016155910721341774,\n \"acc_norm\": 0.37094972067039106,\n \"acc_norm_stderr\": 0.016155910721341774\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.028275490156791455,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.028275490156791455\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.02751374728437942,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.02751374728437942\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766,\n \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3970013037809648,\n \"acc_stderr\": 0.012496346982909553,\n \"acc_norm\": 0.3970013037809648,\n \"acc_norm_stderr\": 0.012496346982909553\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5375816993464052,\n \"acc_stderr\": 0.02017061497496976,\n \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.02017061497496976\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n \"mc1_stderr\": 0.016305988648920612,\n \"mc2\": 0.45724154700953135,\n \"mc2_stderr\": 0.015310459215672905\n }\n}\n```", "repo_url": "https://huggingface.co/gywy/llama2-13b-chinese-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": 
"2023_07_26T15_10_00.921624", "path": ["**/details_harness|arc:challenge|25_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hellaswag|10_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T15:10:00.921624.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T15:10:00.921624.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T15:10:00.921624.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T15:10:00.921624.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T15:10:00.921624.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_26T15_10_00.921624", "path": ["results_2023-07-26T15:10:00.921624.parquet"]}, {"split": "latest", "path": ["results_2023-07-26T15:10:00.921624.parquet"]}]}]}
2023-08-27T11:39:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of gywy/llama2-13b-chinese-v1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model gywy/llama2-13b-chinese-v1 on the Open LLM Leaderboard. The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-07-26T15:10:00.921624 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of gywy/llama2-13b-chinese-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model gywy/llama2-13b-chinese-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-26T15:10:00.921624 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of gywy/llama2-13b-chinese-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model gywy/llama2-13b-chinese-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-26T15:10:00.921624 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of gywy/llama2-13b-chinese-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model gywy/llama2-13b-chinese-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-26T15:10:00.921624 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3e526fbfa77a27046cbfcdc96fef146073aca9ce
# Dataset Card for Evaluation run of GOAT-AI/GOAT-7B-Community

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/GOAT-AI/GOAT-7B-Community
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [GOAT-AI/GOAT-7B-Community](https://huggingface.co/GOAT-AI/GOAT-7B-Community) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T17:14:52.967997](https://huggingface.co/datasets/open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community/blob/main/results_2023-09-22T17-14-52.967997.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.005243288590604027,
        "em_stderr": 0.0007396052260778031,
        "f1": 0.06909395973154382,
        "f1_stderr": 0.0015832414439852427,
        "acc": 0.3838492484021702,
        "acc_stderr": 0.009135888573374731
    },
    "harness|drop|3": {
        "em": 0.005243288590604027,
        "em_stderr": 0.0007396052260778031,
        "f1": 0.06909395973154382,
        "f1_stderr": 0.0015832414439852427
    },
    "harness|gsm8k|5": {
        "acc": 0.04473085670962851,
        "acc_stderr": 0.005693886131407048
    },
    "harness|winogrande|5": {
        "acc": 0.7229676400947119,
        "acc_stderr": 0.012577891015342414
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
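Beyond the single-config example above, a quick way to explore this dataset is to enumerate its configurations and pull the aggregated "results" configuration at its "latest" split. The sketch below is illustrative and assumes only the standard `datasets` client API (`get_dataset_config_names` / `load_dataset`); the repository, config, and split names come from the card above.

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community"

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(REPO)
print(len(configs), "configurations available")

# The "latest" split of the "results" config holds the most recent aggregated metrics.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```

Each per-task configuration can be loaded the same way by swapping its name into the second call.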
open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community
[ "region:us" ]
2023-08-18T11:00:01+00:00
{"pretty_name": "Evaluation run of GOAT-AI/GOAT-7B-Community", "dataset_summary": "Dataset automatically created during the evaluation run of model [GOAT-AI/GOAT-7B-Community](https://huggingface.co/GOAT-AI/GOAT-7B-Community) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T17:14:52.967997](https://huggingface.co/datasets/open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community/blob/main/results_2023-09-22T17-14-52.967997.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005243288590604027,\n \"em_stderr\": 0.0007396052260778031,\n \"f1\": 0.06909395973154382,\n \"f1_stderr\": 0.0015832414439852427,\n \"acc\": 0.3838492484021702,\n \"acc_stderr\": 0.009135888573374731\n },\n \"harness|drop|3\": {\n \"em\": 0.005243288590604027,\n \"em_stderr\": 0.0007396052260778031,\n \"f1\": 0.06909395973154382,\n \"f1_stderr\": 0.0015832414439852427\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04473085670962851,\n \"acc_stderr\": 0.005693886131407048\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7229676400947119,\n \"acc_stderr\": 0.012577891015342414\n }\n}\n```", "repo_url": "https://huggingface.co/GOAT-AI/GOAT-7B-Community", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|arc:challenge|25_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T17_14_52.967997", "path": ["**/details_harness|drop|3_2023-09-22T17-14-52.967997.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T17-14-52.967997.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T17_14_52.967997", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-14-52.967997.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-14-52.967997.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hellaswag|10_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:51:32.230763.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:51:32.230763.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T12:51:32.230763.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T12:51:32.230763.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T12:51:32.230763.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T17_14_52.967997", "path": ["**/details_harness|winogrande|5_2023-09-22T17-14-52.967997.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T17-14-52.967997.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T12_51_32.230763", "path": ["results_2023-07-25T12:51:32.230763.parquet"]}, {"split": "2023_09_22T17_14_52.967997", "path": ["results_2023-09-22T17-14-52.967997.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T17-14-52.967997.parquet"]}]}]}
2023-09-22T16:15:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of GOAT-AI/GOAT-7B-Community ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model GOAT-AI/GOAT-7B-Community on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T17:14:52.967997(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of GOAT-AI/GOAT-7B-Community", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model GOAT-AI/GOAT-7B-Community on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T17:14:52.967997(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of GOAT-AI/GOAT-7B-Community", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model GOAT-AI/GOAT-7B-Community on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T17:14:52.967997(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of GOAT-AI/GOAT-7B-Community## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model GOAT-AI/GOAT-7B-Community on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T17:14:52.967997(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7890e85700b3bac1c70a8938a17e77737c42e7e7
# Dataset Card for "mmlu-computer_security-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-computer_security-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:00:02+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5063, "num_examples": 5}, {"name": "test", "num_bytes": 229284, "num_examples": 100}], "download_size": 13363, "dataset_size": 234347}}
2023-08-21T06:33:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-computer_security-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-computer_security-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-computer_security-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 25 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-computer_security-neg-prepend-fix\"\n\nMore Information needed" ]
42c679919dbc340aa35c0a55df31f54db6b7f5d1
# Dataset Card for Evaluation run of Monero/Manticore-13b-Chat-Pyg-Guanaco

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Monero/Manticore-13b-Chat-Pyg-Guanaco](https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T08:05:02.846180](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco/blob/main/results_2023-09-17T08-05-02.846180.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.1636954697986577,
        "em_stderr": 0.00378913611358371,
        "f1": 0.25622378355704734,
        "f1_stderr": 0.003909791858313052,
        "acc": 0.412985669347219,
        "acc_stderr": 0.010037439004551042
    },
    "harness|drop|3": {
        "em": 0.1636954697986577,
        "em_stderr": 0.00378913611358371,
        "f1": 0.25622378355704734,
        "f1_stderr": 0.003909791858313052
    },
    "harness|gsm8k|5": {
        "acc": 0.08642911296436695,
        "acc_stderr": 0.007740044337103798
    },
    "harness|winogrande|5": {
        "acc": 0.739542225730071,
        "acc_stderr": 0.012334833671998289
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco
[ "region:us" ]
2023-08-18T11:00:10+00:00
{"pretty_name": "Evaluation run of Monero/Manticore-13b-Chat-Pyg-Guanaco", "dataset_summary": "Dataset automatically created during the evaluation run of model [Monero/Manticore-13b-Chat-Pyg-Guanaco](https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T08:05:02.846180](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco/blob/main/results_2023-09-17T08-05-02.846180.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1636954697986577,\n \"em_stderr\": 0.00378913611358371,\n \"f1\": 0.25622378355704734,\n \"f1_stderr\": 0.003909791858313052,\n \"acc\": 0.412985669347219,\n \"acc_stderr\": 0.010037439004551042\n },\n \"harness|drop|3\": {\n \"em\": 0.1636954697986577,\n \"em_stderr\": 0.00378913611358371,\n \"f1\": 0.25622378355704734,\n \"f1_stderr\": 0.003909791858313052\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \"acc_stderr\": 0.007740044337103798\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998289\n }\n}\n```", "repo_url": "https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T08_05_02.846180", "path": ["**/details_harness|drop|3_2023-09-17T08-05-02.846180.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T08-05-02.846180.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T08_05_02.846180", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-05-02.846180.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-05-02.846180.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:26:13.261313.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:13.261313.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:13.261313.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:13.261313.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:26:13.261313.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:26:13.261313.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T08_05_02.846180", "path": ["**/details_harness|winogrande|5_2023-09-17T08-05-02.846180.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T08-05-02.846180.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_26_13.261313", "path": ["results_2023-07-19T18:26:13.261313.parquet"]}, {"split": "2023_09_17T08_05_02.846180", "path": ["results_2023-09-17T08-05-02.846180.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T08-05-02.846180.parquet"]}]}]}
2023-09-17T07:05:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Monero/Manticore-13b-Chat-Pyg-Guanaco ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Monero/Manticore-13b-Chat-Pyg-Guanaco on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T08:05:02.846180 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Monero/Manticore-13b-Chat-Pyg-Guanaco", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/Manticore-13b-Chat-Pyg-Guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T08:05:02.846180(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Monero/Manticore-13b-Chat-Pyg-Guanaco", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/Manticore-13b-Chat-Pyg-Guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T08:05:02.846180(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Monero/Manticore-13b-Chat-Pyg-Guanaco## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/Manticore-13b-Chat-Pyg-Guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T08:05:02.846180(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f180360fb317673ba619c4a7146bd5c0f6dd7d03
# Dataset Card for "mmlu-conceptual_physics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-conceptual_physics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:00:15+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 4993, "num_examples": 5}, {"name": "test", "num_bytes": 438778, "num_examples": 235}], "download_size": 13083, "dataset_size": 443771}}
2023-08-21T06:33:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-conceptual_physics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-conceptual_physics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-conceptual_physics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-conceptual_physics-neg-prepend-fix\"\n\nMore Information needed" ]
cbe316592e329068f64c5f97f04e4859f5274d96
# Dataset Card for Evaluation run of Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b](https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T18:31:20.676081](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b/blob/main/results_2023-10-15T18-31-20.676081.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.24653942953020133,
        "em_stderr": 0.004413804668718679,
        "f1": 0.33164010067114214,
        "f1_stderr": 0.004375317074606664,
        "acc": 0.38205290535450254,
        "acc_stderr": 0.009533625550775153
    },
    "harness|drop|3": {
        "em": 0.24653942953020133,
        "em_stderr": 0.004413804668718679,
        "f1": 0.33164010067114214,
        "f1_stderr": 0.004375317074606664
    },
    "harness|gsm8k|5": {
        "acc": 0.05534495830174375,
        "acc_stderr": 0.006298221796179607
    },
    "harness|winogrande|5": {
        "acc": 0.7087608524072613,
        "acc_stderr": 0.012769029305370699
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
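For illustration, the snippet above loads a single harness configuration with the "train" split. Below is a minimal sketch of the same loading pattern applied to the aggregated "results" config and to a timestamp-named split; the config and split names used here are taken from this repo's configuration list, but the exact splits available may differ between runs.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b"

# Aggregated metrics for the whole run; the "latest" split points to the newest results file.
results = load_dataset(REPO, "results", split="latest")

# A single task from a specific run, addressed by its timestamped split name
# (split names replace ":" and "-" in the timestamp with "_", as listed in the repo config).
gsm8k_run = load_dataset(REPO, "harness_gsm8k_5", split="2023_10_15T18_31_20.676081")
```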
open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b
[ "region:us" ]
2023-08-18T11:00:19+00:00
{"pretty_name": "Evaluation run of Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b](https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T18:31:20.676081](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b/blob/main/results_2023-10-15T18-31-20.676081.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24653942953020133,\n \"em_stderr\": 0.004413804668718679,\n \"f1\": 0.33164010067114214,\n \"f1_stderr\": 0.004375317074606664,\n \"acc\": 0.38205290535450254,\n \"acc_stderr\": 0.009533625550775153\n },\n \"harness|drop|3\": {\n \"em\": 0.24653942953020133,\n \"em_stderr\": 0.004413804668718679,\n \"f1\": 0.33164010067114214,\n \"f1_stderr\": 0.004375317074606664\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \"acc_stderr\": 0.006298221796179607\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7087608524072613,\n \"acc_stderr\": 0.012769029305370699\n }\n}\n```", "repo_url": "https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T18_31_20.676081", "path": ["**/details_harness|drop|3_2023-10-15T18-31-20.676081.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T18-31-20.676081.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T18_31_20.676081", "path": ["**/details_harness|gsm8k|5_2023-10-15T18-31-20.676081.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T18-31-20.676081.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:17:39.123351.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:17:39.123351.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:17:39.123351.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:17:39.123351.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:17:39.123351.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:17:39.123351.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:17:39.123351.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T18_31_20.676081", "path": ["**/details_harness|winogrande|5_2023-10-15T18-31-20.676081.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T18-31-20.676081.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_17_39.123351", "path": ["results_2023-07-19T22:17:39.123351.parquet"]}, {"split": "2023_10_15T18_31_20.676081", "path": ["results_2023-10-15T18-31-20.676081.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T18-31-20.676081.parquet"]}]}]}
2023-10-15T17:31:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T18:31:20.676081 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T18:31:20.676081(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T18:31:20.676081(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 31, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T18:31:20.676081(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6da9f64cd645e89d0ff954676113b7a12d73df4a
# Dataset Card for Evaluation run of Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b](https://huggingface.co/Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-16T20:24:34.064678](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b/blob/main/results_2023-09-16T20-24-34.064678.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.23898909395973153,
        "em_stderr": 0.004367411698321815,
        "f1": 0.33218645134228264,
        "f1_stderr": 0.0042948501285767545,
        "acc": 0.37893683059743066,
        "acc_stderr": 0.008783513808235714
    },
    "harness|drop|3": {
        "em": 0.23898909395973153,
        "em_stderr": 0.004367411698321815,
        "f1": 0.33218645134228264,
        "f1_stderr": 0.0042948501285767545
    },
    "harness|gsm8k|5": {
        "acc": 0.03411675511751327,
        "acc_stderr": 0.005000212600773271
    },
    "harness|winogrande|5": {
        "acc": 0.7237569060773481,
        "acc_stderr": 0.012566815015698158
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
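As a hedged addition to the loading example above (not part of the auto-generated card): the aggregated metrics quoted under "Latest results" live in the "results" configuration, and its "latest" split points at the newest run — both names appear in this repository's configuration list. The snippet below is only a sketch of reading them back; the exact row layout of the results table is an assumption.

```python
from datasets import load_dataset

# Sketch: pull the aggregated metrics for the most recent evaluation run.
# "results" is the aggregate configuration and "latest" its newest split,
# as declared in this dataset's config metadata; the row structure printed
# here is assumed, not guaranteed.
results = load_dataset(
    "open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b",
    "results",
    split="latest",
)
print(results[0])  # expected to hold the aggregated metrics shown above
```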
open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b
[ "region:us" ]
2023-08-18T11:00:27+00:00
{"pretty_name": "Evaluation run of Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b](https://huggingface.co/Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T20:24:34.064678](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b/blob/main/results_2023-09-16T20-24-34.064678.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23898909395973153,\n \"em_stderr\": 0.004367411698321815,\n \"f1\": 0.33218645134228264,\n \"f1_stderr\": 0.0042948501285767545,\n \"acc\": 0.37893683059743066,\n \"acc_stderr\": 0.008783513808235714\n },\n \"harness|drop|3\": {\n \"em\": 0.23898909395973153,\n \"em_stderr\": 0.004367411698321815,\n \"f1\": 0.33218645134228264,\n \"f1_stderr\": 0.0042948501285767545\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03411675511751327,\n \"acc_stderr\": 0.005000212600773271\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7237569060773481,\n \"acc_stderr\": 0.012566815015698158\n }\n}\n```", "repo_url": "https://huggingface.co/Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T20_24_34.064678", "path": ["**/details_harness|drop|3_2023-09-16T20-24-34.064678.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T20-24-34.064678.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T20_24_34.064678", "path": ["**/details_harness|gsm8k|5_2023-09-16T20-24-34.064678.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T20-24-34.064678.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:53:40.714431.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:53:40.714431.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:53:40.714431.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:53:40.714431.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:53:40.714431.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:53:40.714431.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:53:40.714431.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T20_24_34.064678", "path": ["**/details_harness|winogrande|5_2023-09-16T20-24-34.064678.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T20-24-34.064678.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_53_40.714431", "path": ["results_2023-07-19T22:53:40.714431.parquet"]}, {"split": "2023_09_16T20_24_34.064678", "path": ["results_2023-09-16T20-24-34.064678.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T20-24-34.064678.parquet"]}]}]}
2023-09-16T19:24:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-16T20:24:34.064678 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T20:24:34.064678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T20:24:34.064678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 32, 31, 180, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T20:24:34.064678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
12bdfddcba6a1d57df7bbc6031c689a73e297296
# Dataset Card for "mmlu-econometrics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-econometrics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:00:30+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 7799, "num_examples": 5}, {"name": "test", "num_bytes": 374298, "num_examples": 114}], "download_size": 15514, "dataset_size": 382097}}
2023-08-21T06:33:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-econometrics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-econometrics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-econometrics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 25 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-econometrics-neg-prepend-fix\"\n\nMore Information needed" ]
a6d185bd239c3c545cdac23db1f44ab2db02626e
# Dataset Card for Evaluation run of Monero/WizardLM-13b-OpenAssistant-Uncensored

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Monero/WizardLM-13b-OpenAssistant-Uncensored
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Monero/WizardLM-13b-OpenAssistant-Uncensored](https://huggingface.co/Monero/WizardLM-13b-OpenAssistant-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T15:10:52.677936](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored/blob/main/results_2023-10-18T15-10-52.677936.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0803271812080537,
        "em_stderr": 0.002783476701010582,
        "f1": 0.17449454697986574,
        "f1_stderr": 0.0031261159442318247,
        "acc": 0.3640185665996279,
        "acc_stderr": 0.008815343913571156
    },
    "harness|drop|3": {
        "em": 0.0803271812080537,
        "em_stderr": 0.002783476701010582,
        "f1": 0.17449454697986574,
        "f1_stderr": 0.0031261159442318247
    },
    "harness|gsm8k|5": {
        "acc": 0.030326004548900682,
        "acc_stderr": 0.004723487465514781
    },
    "harness|winogrande|5": {
        "acc": 0.6977111286503551,
        "acc_stderr": 0.012907200361627532
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
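Beyond the per-task example in the card above, this record's config list also declares an aggregated "results" config whose "latest" split points at the most recent run, with earlier runs kept under timestamped splits such as `2023_10_18T15_10_52.677936`. A hedged sketch of reading it follows; the config and split names are taken from the metadata below rather than from anything documented on the card itself.

```python
from datasets import load_dataset

# "results" config, "latest" split: aggregated metrics of the most recent run.
# Older runs remain available under their timestamped split names.
results = load_dataset(
    "open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored",
    "results",
    split="latest",
)
print(results[0])
```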
open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored
[ "region:us" ]
2023-08-18T11:00:36+00:00
{"pretty_name": "Evaluation run of Monero/WizardLM-13b-OpenAssistant-Uncensored", "dataset_summary": "Dataset automatically created during the evaluation run of model [Monero/WizardLM-13b-OpenAssistant-Uncensored](https://huggingface.co/Monero/WizardLM-13b-OpenAssistant-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T15:10:52.677936](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored/blob/main/results_2023-10-18T15-10-52.677936.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0803271812080537,\n \"em_stderr\": 0.002783476701010582,\n \"f1\": 0.17449454697986574,\n \"f1_stderr\": 0.0031261159442318247,\n \"acc\": 0.3640185665996279,\n \"acc_stderr\": 0.008815343913571156\n },\n \"harness|drop|3\": {\n \"em\": 0.0803271812080537,\n \"em_stderr\": 0.002783476701010582,\n \"f1\": 0.17449454697986574,\n \"f1_stderr\": 0.0031261159442318247\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.030326004548900682,\n \"acc_stderr\": 0.004723487465514781\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6977111286503551,\n \"acc_stderr\": 0.012907200361627532\n }\n}\n```", "repo_url": "https://huggingface.co/Monero/WizardLM-13b-OpenAssistant-Uncensored", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T23_12_06.260380", "path": ["**/details_harness|drop|3_2023-10-16T23-12-06.260380.parquet"]}, {"split": "2023_10_18T15_10_52.677936", "path": ["**/details_harness|drop|3_2023-10-18T15-10-52.677936.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T15-10-52.677936.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T23_12_06.260380", "path": ["**/details_harness|gsm8k|5_2023-10-16T23-12-06.260380.parquet"]}, {"split": "2023_10_18T15_10_52.677936", "path": ["**/details_harness|gsm8k|5_2023-10-18T15-10-52.677936.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T15-10-52.677936.parquet"]}]}, 
{"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hellaswag|10_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:19:46.120790.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:19:46.120790.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:19:46.120790.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:19:46.120790.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:19:46.120790.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:19:46.120790.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T23_12_06.260380", "path": ["**/details_harness|winogrande|5_2023-10-16T23-12-06.260380.parquet"]}, {"split": "2023_10_18T15_10_52.677936", "path": ["**/details_harness|winogrande|5_2023-10-18T15-10-52.677936.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T15-10-52.677936.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T13_19_46.120790", "path": ["results_2023-07-24T13:19:46.120790.parquet"]}, {"split": "2023_10_16T23_12_06.260380", "path": ["results_2023-10-16T23-12-06.260380.parquet"]}, {"split": "2023_10_18T15_10_52.677936", "path": ["results_2023-10-18T15-10-52.677936.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T15-10-52.677936.parquet"]}]}]}
2023-10-18T14:11:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Monero/WizardLM-13b-OpenAssistant-Uncensored ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Monero/WizardLM-13b-OpenAssistant-Uncensored on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T15:10:52.677936(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Monero/WizardLM-13b-OpenAssistant-Uncensored", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/WizardLM-13b-OpenAssistant-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T15:10:52.677936(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Monero/WizardLM-13b-OpenAssistant-Uncensored", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/WizardLM-13b-OpenAssistant-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T15:10:52.677936(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Monero/WizardLM-13b-OpenAssistant-Uncensored## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Monero/WizardLM-13b-OpenAssistant-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T15:10:52.677936(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
77449b4751b490987d7946b59c5d7b7fa0772bb7
# Dataset Card for "mmlu-electrical_engineering-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-electrical_engineering-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:00:43+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5473, "num_examples": 5}, {"name": "test", "num_bytes": 275445, "num_examples": 145}], "download_size": 13670, "dataset_size": 280918}}
2023-08-21T06:34:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-electrical_engineering-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-electrical_engineering-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-electrical_engineering-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 27 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-electrical_engineering-neg-prepend-fix\"\n\nMore Information needed" ]
cd1a7c2a0fdf230c9bf9b3139f19334e6f9cf250
# Dataset Card for Evaluation run of wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard](https://huggingface.co/wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-07-19T19:43:56.163640](https://huggingface.co/datasets/open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard/blob/main/results_2023-07-19T19%3A43%3A56.163640.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.3383082942783733, "acc_stderr": 0.034038904937501814, "acc_norm": 0.3424207667888371, "acc_norm_stderr": 0.03402640930744709, "mc1": 0.2741738066095471, "mc1_stderr": 0.015616518497219373, "mc2": 0.4327576136566873, "mc2_stderr": 0.015062768361653264 }, "harness|arc:challenge|25": { "acc": 0.4684300341296928, "acc_stderr": 0.014582236460866984, "acc_norm": 0.5127986348122867, "acc_norm_stderr": 0.014606603181012538 }, "harness|hellaswag|10": { "acc": 0.5763792073292173, "acc_stderr": 0.004931219148182242, "acc_norm": 0.7746464847639912, "acc_norm_stderr": 0.0041696102548079705 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.34814814814814815, "acc_stderr": 0.041153246103369526, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3157894736842105, "acc_stderr": 0.037827289808654685, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.037827289808654685 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.37735849056603776, "acc_stderr": 0.029832808114796005, "acc_norm": 0.37735849056603776, "acc_norm_stderr": 0.029832808114796005 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2916666666666667, "acc_stderr": 0.03800968060554858, "acc_norm": 0.2916666666666667, "acc_norm_stderr": 0.03800968060554858 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3063583815028902, "acc_stderr": 0.03514942551267437, "acc_norm": 0.3063583815028902, "acc_norm_stderr": 0.03514942551267437 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171452, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171452 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3617021276595745, "acc_stderr": 0.03141082197596241, "acc_norm": 0.3617021276595745, "acc_norm_stderr": 0.03141082197596241 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322004, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322004 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.22758620689655173, "acc_stderr": 0.03493950380131184, "acc_norm": 0.22758620689655173, "acc_norm_stderr": 0.03493950380131184 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24338624338624337, "acc_stderr": 0.022101128787415426, "acc_norm": 0.24338624338624337, "acc_norm_stderr": 0.022101128787415426 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2222222222222222, "acc_stderr": 
0.037184890068181146, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.037184890068181146 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3387096774193548, "acc_stderr": 0.026923446059302834, "acc_norm": 0.3387096774193548, "acc_norm_stderr": 0.026923446059302834 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.28078817733990147, "acc_stderr": 0.0316185633535861, "acc_norm": 0.28078817733990147, "acc_norm_stderr": 0.0316185633535861 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.41818181818181815, "acc_stderr": 0.03851716319398393, "acc_norm": 0.41818181818181815, "acc_norm_stderr": 0.03851716319398393 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.41919191919191917, "acc_stderr": 0.035155207286704175, "acc_norm": 0.41919191919191917, "acc_norm_stderr": 0.035155207286704175 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.42487046632124353, "acc_stderr": 0.0356747133521254, "acc_norm": 0.42487046632124353, "acc_norm_stderr": 0.0356747133521254 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.3128205128205128, "acc_stderr": 0.02350757902064535, "acc_norm": 0.3128205128205128, "acc_norm_stderr": 0.02350757902064535 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.0263357394040558, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.0263357394040558 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.02959732973097809, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.02959732973097809 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.26490066225165565, "acc_stderr": 0.036030385453603854, "acc_norm": 0.26490066225165565, "acc_norm_stderr": 0.036030385453603854 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3577981651376147, "acc_stderr": 0.020552060784827814, "acc_norm": 0.3577981651376147, "acc_norm_stderr": 0.020552060784827814 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2824074074074074, "acc_stderr": 0.03070137211151094, "acc_norm": 0.2824074074074074, "acc_norm_stderr": 0.03070137211151094 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.37254901960784315, "acc_stderr": 0.03393388584958404, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.03393388584958404 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.3755274261603376, "acc_stderr": 0.03152256243091156, "acc_norm": 0.3755274261603376, "acc_norm_stderr": 0.03152256243091156 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.36771300448430494, "acc_stderr": 0.03236198350928276, "acc_norm": 0.36771300448430494, "acc_norm_stderr": 0.03236198350928276 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.32061068702290074, "acc_stderr": 0.04093329229834278, "acc_norm": 0.32061068702290074, "acc_norm_stderr": 0.04093329229834278 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5702479338842975, "acc_stderr": 0.04519082021319773, "acc_norm": 0.5702479338842975, "acc_norm_stderr": 0.04519082021319773 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.35185185185185186, "acc_stderr": 0.04616631111801713, 
"acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.04616631111801713 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.34355828220858897, "acc_stderr": 0.03731133519673893, "acc_norm": 0.34355828220858897, "acc_norm_stderr": 0.03731133519673893 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.21428571428571427, "acc_stderr": 0.038946411200447915, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.038946411200447915 }, "harness|hendrycksTest-management|5": { "acc": 0.32038834951456313, "acc_stderr": 0.04620284082280039, "acc_norm": 0.32038834951456313, "acc_norm_stderr": 0.04620284082280039 }, "harness|hendrycksTest-marketing|5": { "acc": 0.4188034188034188, "acc_stderr": 0.03232128912157791, "acc_norm": 0.4188034188034188, "acc_norm_stderr": 0.03232128912157791 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.4125159642401022, "acc_stderr": 0.01760414910867193, "acc_norm": 0.4125159642401022, "acc_norm_stderr": 0.01760414910867193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.3439306358381503, "acc_stderr": 0.025574123786546648, "acc_norm": 0.3439306358381503, "acc_norm_stderr": 0.025574123786546648 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2424581005586592, "acc_stderr": 0.014333522059217889, "acc_norm": 0.2424581005586592, "acc_norm_stderr": 0.014333522059217889 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3333333333333333, "acc_stderr": 0.02699254433929725, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.02699254433929725 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.33762057877813506, "acc_stderr": 0.026858825879488547, "acc_norm": 0.33762057877813506, "acc_norm_stderr": 0.026858825879488547 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.35802469135802467, "acc_stderr": 0.026675611926037093, "acc_norm": 0.35802469135802467, "acc_norm_stderr": 0.026675611926037093 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2695035460992908, "acc_stderr": 0.026469036818590627, "acc_norm": 0.2695035460992908, "acc_norm_stderr": 0.026469036818590627 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3089960886571056, "acc_stderr": 0.011801729777239249, "acc_norm": 0.3089960886571056, "acc_norm_stderr": 0.011801729777239249 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.33088235294117646, "acc_stderr": 0.02858270975389844, "acc_norm": 0.33088235294117646, "acc_norm_stderr": 0.02858270975389844 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3284313725490196, "acc_stderr": 0.01899970738316267, "acc_norm": 0.3284313725490196, "acc_norm_stderr": 0.01899970738316267 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.4, "acc_stderr": 0.0469237132203465, "acc_norm": 0.4, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.27755102040816326, "acc_stderr": 0.028666857790274648, "acc_norm": 0.27755102040816326, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.4228855721393035, "acc_stderr": 0.034932317774212816, "acc_norm": 0.4228855721393035, "acc_norm_stderr": 0.034932317774212816 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-virology|5": { "acc": 0.3614457831325301, "acc_stderr": 0.037400593820293204, 
"acc_norm": 0.3614457831325301, "acc_norm_stderr": 0.037400593820293204 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.38011695906432746, "acc_stderr": 0.037229657413855394, "acc_norm": 0.38011695906432746, "acc_norm_stderr": 0.037229657413855394 }, "harness|truthfulqa:mc|0": { "mc1": 0.2741738066095471, "mc1_stderr": 0.015616518497219373, "mc2": 0.4327576136566873, "mc2_stderr": 0.015062768361653264 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard
[ "region:us" ]
2023-08-18T11:00:45+00:00
{"pretty_name": "Evaluation run of wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard", "dataset_summary": "Dataset automatically created during the evaluation run of model [wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard](https://huggingface.co/wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-19T19:43:56.163640](https://huggingface.co/datasets/open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard/blob/main/results_2023-07-19T19%3A43%3A56.163640.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3383082942783733,\n \"acc_stderr\": 0.034038904937501814,\n \"acc_norm\": 0.3424207667888371,\n \"acc_norm_stderr\": 0.03402640930744709,\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.4327576136566873,\n \"mc2_stderr\": 0.015062768361653264\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.014582236460866984,\n \"acc_norm\": 0.5127986348122867,\n \"acc_norm_stderr\": 0.014606603181012538\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5763792073292173,\n \"acc_stderr\": 0.004931219148182242,\n \"acc_norm\": 0.7746464847639912,\n \"acc_norm_stderr\": 0.0041696102548079705\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.037827289808654685,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.037827289808654685\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.37735849056603776,\n \"acc_stderr\": 0.029832808114796005,\n \"acc_norm\": 0.37735849056603776,\n \"acc_norm_stderr\": 0.029832808114796005\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 
0.03800968060554858,\n \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3063583815028902,\n \"acc_stderr\": 0.03514942551267437,\n \"acc_norm\": 0.3063583815028902,\n \"acc_norm_stderr\": 0.03514942551267437\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596241,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596241\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415426,\n \"acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415426\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3387096774193548,\n \"acc_stderr\": 0.026923446059302834,\n \"acc_norm\": 0.3387096774193548,\n \"acc_norm_stderr\": 0.026923446059302834\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398393,\n \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398393\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.41919191919191917,\n \"acc_stderr\": 0.035155207286704175,\n \"acc_norm\": 0.41919191919191917,\n \"acc_norm_stderr\": 0.035155207286704175\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n 
\"acc\": 0.42487046632124353,\n \"acc_stderr\": 0.0356747133521254,\n \"acc_norm\": 0.42487046632124353,\n \"acc_norm_stderr\": 0.0356747133521254\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.02350757902064535,\n \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.02350757902064535\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02959732973097809,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02959732973097809\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603854,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603854\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3577981651376147,\n \"acc_stderr\": 0.020552060784827814,\n \"acc_norm\": 0.3577981651376147,\n \"acc_norm_stderr\": 0.020552060784827814\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2824074074074074,\n \"acc_stderr\": 0.03070137211151094,\n \"acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.03070137211151094\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.03393388584958404,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.03393388584958404\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3755274261603376,\n \"acc_stderr\": 0.03152256243091156,\n \"acc_norm\": 0.3755274261603376,\n \"acc_norm_stderr\": 0.03152256243091156\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.36771300448430494,\n \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.32061068702290074,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.32061068702290074,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319773,\n \"acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319773\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.34355828220858897,\n \"acc_stderr\": 0.03731133519673893,\n \"acc_norm\": 0.34355828220858897,\n \"acc_norm_stderr\": 0.03731133519673893\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.038946411200447915,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.038946411200447915\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.04620284082280039,\n \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.04620284082280039\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4188034188034188,\n \"acc_stderr\": 0.03232128912157791,\n \"acc_norm\": 0.4188034188034188,\n \"acc_norm_stderr\": 0.03232128912157791\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.34,\n 
\"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4125159642401022,\n \"acc_stderr\": 0.01760414910867193,\n \"acc_norm\": 0.4125159642401022,\n \"acc_norm_stderr\": 0.01760414910867193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3439306358381503,\n \"acc_stderr\": 0.025574123786546648,\n \"acc_norm\": 0.3439306358381503,\n \"acc_norm_stderr\": 0.025574123786546648\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02699254433929725,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02699254433929725\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.33762057877813506,\n \"acc_stderr\": 0.026858825879488547,\n \"acc_norm\": 0.33762057877813506,\n \"acc_norm_stderr\": 0.026858825879488547\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.35802469135802467,\n \"acc_stderr\": 0.026675611926037093,\n \"acc_norm\": 0.35802469135802467,\n \"acc_norm_stderr\": 0.026675611926037093\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590627,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590627\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3089960886571056,\n \"acc_stderr\": 0.011801729777239249,\n \"acc_norm\": 0.3089960886571056,\n \"acc_norm_stderr\": 0.011801729777239249\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.02858270975389844,\n \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.02858270975389844\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3284313725490196,\n \"acc_stderr\": 0.01899970738316267,\n \"acc_norm\": 0.3284313725490196,\n \"acc_norm_stderr\": 0.01899970738316267\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4228855721393035,\n \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.4228855721393035,\n \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.38011695906432746,\n \"acc_stderr\": 0.037229657413855394,\n \"acc_norm\": 0.38011695906432746,\n \"acc_norm_stderr\": 0.037229657413855394\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.4327576136566873,\n \"mc2_stderr\": 0.015062768361653264\n }\n}\n```", "repo_url": 
"https://huggingface.co/wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:43:56.163640.parquet", 
"**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:43:56.163640.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:43:56.163640.parquet", 
"**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:43:56.163640.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:43:56.163640.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_43_56.163640", "path": ["results_2023-07-19T19:43:56.163640.parquet"]}, {"split": "latest", "path": ["results_2023-07-19T19:43:56.163640.parquet"]}]}]}
2023-08-27T11:39:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard on the Open LLM Leaderboard. The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-07-19T19:43:56.163640 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
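The loading example referred to in the summary above can be reconstructed from this repository's configuration list. The sketch below is illustrative rather than part of the auto-generated card: it uses the "harness_truthfulqa_mc_0" and "results" configurations named in the configs, reads their "latest" split (the card's own examples use "train", described as pointing to the latest results), and only inspects the returned tables instead of assuming a particular column layout.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard"

# Per-example details for one evaluated task, most recent run.
data = load_dataset(REPO, "harness_truthfulqa_mc_0", split="latest")

# Aggregated metrics for the run ("results" is the summary configuration).
results = load_dataset(REPO, "results", split="latest")

# The column layout is not documented in this card, so inspect it before use.
print(data.column_names)
print(results.column_names)
```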
[ "# Dataset Card for Evaluation run of wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-19T19:43:56.163640 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-19T19:43:56.163640 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 37, 31, 185, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-19T19:43:56.163640 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f560dc6b9ab2a4935e0c23ca9a4e06331e04d40e
# Dataset Card for Evaluation run of frank098/WizardLM_13B_juniper

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/frank098/WizardLM_13B_juniper
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [frank098/WizardLM_13B_juniper](https://huggingface.co/frank098/WizardLM_13B_juniper) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frank098__WizardLM_13B_juniper",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-29T18:18:55.569728](https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__WizardLM_13B_juniper/blob/main/results_2023-10-29T18-18-55.569728.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.00576761744966443,
        "em_stderr": 0.0007755000442814698,
        "f1": 0.07442428691275203,
        "f1_stderr": 0.001635445995042788,
        "acc": 0.39574628120487826,
        "acc_stderr": 0.010113249922128762
    },
    "harness|drop|3": {
        "em": 0.00576761744966443,
        "em_stderr": 0.0007755000442814698,
        "f1": 0.07442428691275203,
        "f1_stderr": 0.001635445995042788
    },
    "harness|gsm8k|5": {
        "acc": 0.0803639120545868,
        "acc_stderr": 0.007488258573239077
    },
    "harness|winogrande|5": {
        "acc": 0.7111286503551697,
        "acc_stderr": 0.012738241271018446
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
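The snippet in the card above pulls a single task's details. As a complementary, hedged sketch (it assumes the `datasets` library and the "results" configuration with a "latest" split, as listed in this record's config metadata), the aggregated scores behind the "Latest results" section can be loaded directly:

```python
from datasets import load_dataset

# Hedged sketch: load the aggregated "results" configuration instead of a
# single task. The "latest" split is expected to point at the most recent
# run, per the config list recorded in this entry's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_frank098__WizardLM_13B_juniper",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the latest run
```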
open-llm-leaderboard/details_frank098__WizardLM_13B_juniper
[ "region:us" ]
2023-08-18T11:00:54+00:00
{"pretty_name": "Evaluation run of frank098/WizardLM_13B_juniper", "dataset_summary": "Dataset automatically created during the evaluation run of model [frank098/WizardLM_13B_juniper](https://huggingface.co/frank098/WizardLM_13B_juniper) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frank098__WizardLM_13B_juniper\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T18:18:55.569728](https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__WizardLM_13B_juniper/blob/main/results_2023-10-29T18-18-55.569728.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00576761744966443,\n \"em_stderr\": 0.0007755000442814698,\n \"f1\": 0.07442428691275203,\n \"f1_stderr\": 0.001635445995042788,\n \"acc\": 0.39574628120487826,\n \"acc_stderr\": 0.010113249922128762\n },\n \"harness|drop|3\": {\n \"em\": 0.00576761744966443,\n \"em_stderr\": 0.0007755000442814698,\n \"f1\": 0.07442428691275203,\n \"f1_stderr\": 0.001635445995042788\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0803639120545868,\n \"acc_stderr\": 0.007488258573239077\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7111286503551697,\n \"acc_stderr\": 0.012738241271018446\n }\n}\n```", "repo_url": "https://huggingface.co/frank098/WizardLM_13B_juniper", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|arc:challenge|25_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T18_18_55.569728", "path": ["**/details_harness|drop|3_2023-10-29T18-18-55.569728.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T18-18-55.569728.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T18_18_55.569728", "path": ["**/details_harness|gsm8k|5_2023-10-29T18-18-55.569728.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T18-18-55.569728.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hellaswag|10_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:54:22.349435.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:54:22.349435.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T12:54:22.349435.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T12:54:22.349435.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T12:54:22.349435.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T18_18_55.569728", "path": ["**/details_harness|winogrande|5_2023-10-29T18-18-55.569728.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T18-18-55.569728.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T12_54_22.349435", "path": ["results_2023-07-24T12:54:22.349435.parquet"]}, {"split": "2023_10_29T18_18_55.569728", "path": ["results_2023-10-29T18-18-55.569728.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T18-18-55.569728.parquet"]}]}]}
2023-10-29T18:19:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of frank098/WizardLM_13B_juniper ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model frank098/WizardLM_13B_juniper on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-29T18:18:55.569728(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of frank098/WizardLM_13B_juniper", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model frank098/WizardLM_13B_juniper on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T18:18:55.569728(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of frank098/WizardLM_13B_juniper", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model frank098/WizardLM_13B_juniper on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T18:18:55.569728(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of frank098/WizardLM_13B_juniper## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model frank098/WizardLM_13B_juniper on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T18:18:55.569728(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
84403c8f067bb8dbc2978e2f7250e4317deab52a
# Dataset Card for "mmlu-elementary_mathematics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-elementary_mathematics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:00:58+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6267, "num_examples": 5}, {"name": "test", "num_bytes": 854641, "num_examples": 378}], "download_size": 14034, "dataset_size": 860908}}
2023-08-21T06:34:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-elementary_mathematics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-elementary_mathematics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-elementary_mathematics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 28 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-elementary_mathematics-neg-prepend-fix\"\n\nMore Information needed" ]
2e3a2107b35cb6ff0b0f138261df5c63071234ce
# Dataset Card for Evaluation run of frank098/orca_mini_3b_juniper

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/frank098/orca_mini_3b_juniper
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [frank098/orca_mini_3b_juniper](https://huggingface.co/frank098/orca_mini_3b_juniper) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frank098__orca_mini_3b_juniper",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T00:19:44.475095](https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__orca_mini_3b_juniper/blob/main/results_2023-09-17T00-19-44.475095.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.000277361445733574,
        "f1": 0.04966652684563771,
        "f1_stderr": 0.001261898789421576,
        "acc": 0.3041531307650375,
        "acc_stderr": 0.007876199120377373
    },
    "harness|drop|3": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.000277361445733574,
        "f1": 0.04966652684563771,
        "f1_stderr": 0.001261898789421576
    },
    "harness|gsm8k|5": {
        "acc": 0.00530705079605762,
        "acc_stderr": 0.002001305720948044
    },
    "harness|winogrande|5": {
        "acc": 0.6029992107340174,
        "acc_stderr": 0.013751092519806702
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
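Beyond the winogrande example in the card above, the per-sample details of any single task can be loaded by name; a hedged sketch using the "harness_gsm8k_5" config and "latest" split listed in this record's metadata:

```python
from datasets import load_dataset

# Hedged sketch: pull per-sample details for one task. The config name and
# the "latest" split come from this record's config list.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_frank098__orca_mini_3b_juniper",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details.column_names)
print(len(gsm8k_details))
```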
open-llm-leaderboard/details_frank098__orca_mini_3b_juniper
[ "region:us" ]
2023-08-18T11:01:02+00:00
{"pretty_name": "Evaluation run of frank098/orca_mini_3b_juniper", "dataset_summary": "Dataset automatically created during the evaluation run of model [frank098/orca_mini_3b_juniper](https://huggingface.co/frank098/orca_mini_3b_juniper) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frank098__orca_mini_3b_juniper\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T00:19:44.475095](https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__orca_mini_3b_juniper/blob/main/results_2023-09-17T00-19-44.475095.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.000277361445733574,\n \"f1\": 0.04966652684563771,\n \"f1_stderr\": 0.001261898789421576,\n \"acc\": 0.3041531307650375,\n \"acc_stderr\": 0.007876199120377373\n },\n \"harness|drop|3\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.000277361445733574,\n \"f1\": 0.04966652684563771,\n \"f1_stderr\": 0.001261898789421576\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.002001305720948044\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6029992107340174,\n \"acc_stderr\": 0.013751092519806702\n }\n}\n```", "repo_url": "https://huggingface.co/frank098/orca_mini_3b_juniper", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T00_19_44.475095", "path": ["**/details_harness|drop|3_2023-09-17T00-19-44.475095.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T00-19-44.475095.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T00_19_44.475095", "path": ["**/details_harness|gsm8k|5_2023-09-17T00-19-44.475095.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T00-19-44.475095.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:27:47.193085.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:27:47.193085.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:27:47.193085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:27:47.193085.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:27:47.193085.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T00_19_44.475095", "path": ["**/details_harness|winogrande|5_2023-09-17T00-19-44.475095.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T00-19-44.475095.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T10_27_47.193085", "path": ["results_2023-07-24T10:27:47.193085.parquet"]}, {"split": "2023_09_17T00_19_44.475095", "path": ["results_2023-09-17T00-19-44.475095.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T00-19-44.475095.parquet"]}]}]}
2023-09-16T23:19:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of frank098/orca_mini_3b_juniper ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model frank098/orca_mini_3b_juniper on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T00:19:44.475095(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of frank098/orca_mini_3b_juniper", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model frank098/orca_mini_3b_juniper on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T00:19:44.475095(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of frank098/orca_mini_3b_juniper", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model frank098/orca_mini_3b_juniper on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T00:19:44.475095(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of frank098/orca_mini_3b_juniper## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model frank098/orca_mini_3b_juniper on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T00:19:44.475095(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
da027bd78208d7a3ca9b141573f49ab79b7099b2
# Dataset Card for Evaluation run of FabbriSimo01/Facebook_opt_1.3b_Quantized

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/FabbriSimo01/Facebook_opt_1.3b_Quantized
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [FabbriSimo01/Facebook_opt_1.3b_Quantized](https://huggingface.co/FabbriSimo01/Facebook_opt_1.3b_Quantized) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-18T06:09:22.891569](https://huggingface.co/datasets/open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized/blob/main/results_2023-09-18T06-09-22.891569.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0020973154362416107,
        "em_stderr": 0.00046850650303682405,
        "f1": 0.05110318791946325,
        "f1_stderr": 0.0012507542097710141,
        "acc": 0.29910069155018665,
        "acc_stderr": 0.007429518317222754
    },
    "harness|drop|3": {
        "em": 0.0020973154362416107,
        "em_stderr": 0.00046850650303682405,
        "f1": 0.05110318791946325,
        "f1_stderr": 0.0012507542097710141
    },
    "harness|gsm8k|5": {
        "acc": 0.001516300227445034,
        "acc_stderr": 0.0010717793485492619
    },
    "harness|winogrande|5": {
        "acc": 0.5966850828729282,
        "acc_stderr": 0.013787257285896245
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
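The card's snippet above covers the per-task configurations; as a minimal sketch (assuming this repo's "results" configuration follows the same pattern as the per-task configs listed in the metadata blobs of these evaluation-run datasets, with a "latest" split pointing at the most recent run), the aggregated metrics could be loaded the same way:

```python
from datasets import load_dataset

# Sketch only: the "results" config name and the "latest" split are assumed
# from the config listings of these evaluation-run datasets, not verified
# against this specific repository.
results = load_dataset(
    "open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized",
    "results",
    split="latest",
)
print(results[0])  # first row of the aggregated results table
```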
open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized
[ "region:us" ]
2023-08-18T11:01:11+00:00
{"pretty_name": "Evaluation run of FabbriSimo01/Facebook_opt_1.3b_Quantized", "dataset_summary": "Dataset automatically created during the evaluation run of model [FabbriSimo01/Facebook_opt_1.3b_Quantized](https://huggingface.co/FabbriSimo01/Facebook_opt_1.3b_Quantized) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T06:09:22.891569](https://huggingface.co/datasets/open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized/blob/main/results_2023-09-18T06-09-22.891569.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303682405,\n \"f1\": 0.05110318791946325,\n \"f1_stderr\": 0.0012507542097710141,\n \"acc\": 0.29910069155018665,\n \"acc_stderr\": 0.007429518317222754\n },\n \"harness|drop|3\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303682405,\n \"f1\": 0.05110318791946325,\n \"f1_stderr\": 0.0012507542097710141\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 0.0010717793485492619\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5966850828729282,\n \"acc_stderr\": 0.013787257285896245\n }\n}\n```", "repo_url": "https://huggingface.co/FabbriSimo01/Facebook_opt_1.3b_Quantized", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T06_09_22.891569", "path": ["**/details_harness|drop|3_2023-09-18T06-09-22.891569.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T06-09-22.891569.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T06_09_22.891569", "path": ["**/details_harness|gsm8k|5_2023-09-18T06-09-22.891569.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T06-09-22.891569.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:58:20.478747.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:58:20.478747.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:58:20.478747.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:58:20.478747.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:58:20.478747.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:58:20.478747.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T06_09_22.891569", "path": ["**/details_harness|winogrande|5_2023-09-18T06-09-22.891569.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T06-09-22.891569.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_58_20.478747", "path": ["results_2023-07-19T14:58:20.478747.parquet"]}, {"split": "2023_09_18T06_09_22.891569", "path": ["results_2023-09-18T06-09-22.891569.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T06-09-22.891569.parquet"]}]}]}
2023-09-18T05:09:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FabbriSimo01/Facebook_opt_1.3b_Quantized ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model FabbriSimo01/Facebook_opt_1.3b_Quantized on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-18T06:09:22.891569 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
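The load call referenced above ("you can for instance do the following:") is not shown in this plain-text rendering. A minimal sketch, assuming the details repository follows the usual open-llm-leaderboard naming convention (the exact repository id is an assumption here, not stated in this rendering) and using the `harness_winogrande_5` configuration and `latest` split listed in the metadata above:

```python
from datasets import load_dataset

# Repository id is assumed from the open-llm-leaderboard naming convention; verify before use.
data = load_dataset(
    "open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized",
    "harness_winogrande_5",
    split="latest",  # the metadata lists a "latest" split alongside the timestamped run
)
```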
[ "# Dataset Card for Evaluation run of FabbriSimo01/Facebook_opt_1.3b_Quantized", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FabbriSimo01/Facebook_opt_1.3b_Quantized on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T06:09:22.891569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FabbriSimo01/Facebook_opt_1.3b_Quantized", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FabbriSimo01/Facebook_opt_1.3b_Quantized on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T06:09:22.891569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FabbriSimo01/Facebook_opt_1.3b_Quantized## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model FabbriSimo01/Facebook_opt_1.3b_Quantized on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T06:09:22.891569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
68f4569018fa4004f2508db005866286340e34d2
# Dataset Card for "mmlu-formal_logic-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-formal_logic-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:01:12+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 7320, "num_examples": 5}, {"name": "test", "num_bytes": 426415, "num_examples": 126}], "download_size": 16460, "dataset_size": 433735}}
2023-08-21T06:34:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-formal_logic-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-formal_logic-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-formal_logic-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-formal_logic-neg-prepend-fix\"\n\nMore Information needed" ]
e00f6400811e258d2de7cff92b23695abcf70bcf
# Dataset Card for Evaluation run of arver/llama7b-qlora
## Dataset Description
- **Homepage:** 
- **Repository:** https://huggingface.co/arver/llama7b-qlora
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary

Dataset automatically created during the evaluation run of model [arver/llama7b-qlora](https://huggingface.co/arver/llama7b-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arver__llama7b-qlora",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T10:28:15.239885](https://huggingface.co/datasets/open-llm-leaderboard/details_arver__llama7b-qlora/blob/main/results_2023-09-17T10-28-15.239885.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.0003921042190298541,
        "f1": 0.06146078020134238,
        "f1_stderr": 0.0013862861484435665,
        "acc": 0.37858887140948305,
        "acc_stderr": 0.008690432281689055
    },
    "harness|drop|3": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.0003921042190298541,
        "f1": 0.06146078020134238,
        "f1_stderr": 0.0013862861484435665
    },
    "harness|gsm8k|5": {
        "acc": 0.03184230477634572,
        "acc_stderr": 0.004836348558260928
    },
    "harness|winogrande|5": {
        "acc": 0.7253354380426204,
        "acc_stderr": 0.012544516005117185
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
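The aggregated "results" configuration described above can be loaded the same way as the per-task configurations. A minimal sketch, assuming the standard `datasets` API and the config/split names listed in this record's metadata (the exact column layout of the results parquet is not documented in the card):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; its "latest" split
# points at the most recent evaluation run, per the metadata for this dataset.
results = load_dataset(
    "open-llm-leaderboard/details_arver__llama7b-qlora",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the schema; exact columns are not shown in the card
```

Pinning one of the timestamped splits instead of "latest" reproduces a particular evaluation run.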
open-llm-leaderboard/details_arver__llama7b-qlora
[ "region:us" ]
2023-08-18T11:01:20+00:00
{"pretty_name": "Evaluation run of arver/llama7b-qlora", "dataset_summary": "Dataset automatically created during the evaluation run of model [arver/llama7b-qlora](https://huggingface.co/arver/llama7b-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arver__llama7b-qlora\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T10:28:15.239885](https://huggingface.co/datasets/open-llm-leaderboard/details_arver__llama7b-qlora/blob/main/results_2023-09-17T10-28-15.239885.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298541,\n \"f1\": 0.06146078020134238,\n \"f1_stderr\": 0.0013862861484435665,\n \"acc\": 0.37858887140948305,\n \"acc_stderr\": 0.008690432281689055\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298541,\n \"f1\": 0.06146078020134238,\n \"f1_stderr\": 0.0013862861484435665\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03184230477634572,\n \"acc_stderr\": 0.004836348558260928\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7253354380426204,\n \"acc_stderr\": 0.012544516005117185\n }\n}\n```", "repo_url": "https://huggingface.co/arver/llama7b-qlora", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|arc:challenge|25_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T10_28_15.239885", "path": ["**/details_harness|drop|3_2023-09-17T10-28-15.239885.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T10-28-15.239885.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T10_28_15.239885", "path": ["**/details_harness|gsm8k|5_2023-09-17T10-28-15.239885.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T10-28-15.239885.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hellaswag|10_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:44:33.087537.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:44:33.087537.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T19:44:33.087537.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T19:44:33.087537.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T19:44:33.087537.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T10_28_15.239885", "path": ["**/details_harness|winogrande|5_2023-09-17T10-28-15.239885.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T10-28-15.239885.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T19_44_33.087537", "path": ["results_2023-08-09T19:44:33.087537.parquet"]}, {"split": "2023_09_17T10_28_15.239885", "path": ["results_2023-09-17T10-28-15.239885.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T10-28-15.239885.parquet"]}]}]}
2023-09-17T09:28:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of arver/llama7b-qlora ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model arver/llama7b-qlora on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T10:28:15.239885 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
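For example, the per-run details for this card could be loaded with the `datasets` library (a minimal sketch; the repository id is assumed to follow the leaderboard's usual `details_<org>__<model>` naming, since it is not spelled out above):

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's usual
# "open-llm-leaderboard/details_<org>__<model>" pattern for this model.
data = load_dataset(
    "open-llm-leaderboard/details_arver__llama7b-qlora",
    "harness_winogrande_5",   # one of the 64 per-task configurations
    split="train",            # "train" always points at the latest run
)
```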
[ "# Dataset Card for Evaluation run of arver/llama7b-qlora", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model arver/llama7b-qlora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T10:28:15.239885(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of arver/llama7b-qlora", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model arver/llama7b-qlora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T10:28:15.239885(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of arver/llama7b-qlora## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model arver/llama7b-qlora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T10:28:15.239885(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
682123fa335845a456d157d1c77597025ebade65
# Dataset Card for "mmlu-global_facts-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-global_facts-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:01:26+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5659, "num_examples": 5}, {"name": "test", "num_bytes": 225313, "num_examples": 100}], "download_size": 13046, "dataset_size": 230972}}
2023-08-21T06:34:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-global_facts-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-global_facts-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-global_facts-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-global_facts-neg-prepend-fix\"\n\nMore Information needed" ]
f433267f3d392b21299ef52de6ea21a9a95ea925
# Dataset Card for Evaluation run of beaugogh/pythia-1.4b-deduped-sharegpt ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/beaugogh/pythia-1.4b-deduped-sharegpt - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [beaugogh/pythia-1.4b-deduped-sharegpt](https://huggingface.co/beaugogh/pythia-1.4b-deduped-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T17:32:01.983101](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt/blob/main/results_2023-10-15T17-32-01.983101.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0010486577181208054, "em_stderr": 0.00033145814652192217, "f1": 0.04875104865771823, "f1_stderr": 0.0012458540332815637, "acc": 0.2804129195481258, "acc_stderr": 0.008239894933698364 }, "harness|drop|3": { "em": 0.0010486577181208054, "em_stderr": 0.00033145814652192217, "f1": 0.04875104865771823, "f1_stderr": 0.0012458540332815637 }, "harness|gsm8k|5": { "acc": 0.008339651250947688, "acc_stderr": 0.002504942226860534 }, "harness|winogrande|5": { "acc": 0.5524861878453039, "acc_stderr": 0.013974847640536194 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt
[ "region:us" ]
2023-08-18T11:01:29+00:00
{"pretty_name": "Evaluation run of beaugogh/pythia-1.4b-deduped-sharegpt", "dataset_summary": "Dataset automatically created during the evaluation run of model [beaugogh/pythia-1.4b-deduped-sharegpt](https://huggingface.co/beaugogh/pythia-1.4b-deduped-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T17:32:01.983101](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt/blob/main/results_2023-10-15T17-32-01.983101.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192217,\n \"f1\": 0.04875104865771823,\n \"f1_stderr\": 0.0012458540332815637,\n \"acc\": 0.2804129195481258,\n \"acc_stderr\": 0.008239894933698364\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192217,\n \"f1\": 0.04875104865771823,\n \"f1_stderr\": 0.0012458540332815637\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \"acc_stderr\": 0.002504942226860534\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5524861878453039,\n \"acc_stderr\": 0.013974847640536194\n }\n}\n```", "repo_url": "https://huggingface.co/beaugogh/pythia-1.4b-deduped-sharegpt", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|arc:challenge|25_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T17_32_01.983101", "path": ["**/details_harness|drop|3_2023-10-15T17-32-01.983101.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T17-32-01.983101.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T17_32_01.983101", "path": ["**/details_harness|gsm8k|5_2023-10-15T17-32-01.983101.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T17-32-01.983101.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hellaswag|10_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:37:34.765508.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:37:34.765508.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:37:34.765508.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T09:37:34.765508.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T09:37:34.765508.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T09:37:34.765508.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T17_32_01.983101", "path": ["**/details_harness|winogrande|5_2023-10-15T17-32-01.983101.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T17-32-01.983101.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T09_37_34.765508", "path": ["results_2023-07-31T09:37:34.765508.parquet"]}, {"split": "2023_10_15T17_32_01.983101", "path": ["results_2023-10-15T17-32-01.983101.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T17-32-01.983101.parquet"]}]}]}
2023-10-15T16:32:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of beaugogh/pythia-1.4b-deduped-sharegpt ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model beaugogh/pythia-1.4b-deduped-sharegpt on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T17:32:01.983101 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
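Beyond the per-task details, the aggregated numbers mentioned above live in the "results" configuration; a minimal sketch of reading its latest run (configuration and split names taken from this card's file listing):

```python
from datasets import load_dataset

# "results" aggregates every run; the "latest" split points at the most
# recent evaluation of this model (2023-10-15 for this card).
results = load_dataset(
    "open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt",
    "results",
    split="latest",
)
```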
[ "# Dataset Card for Evaluation run of beaugogh/pythia-1.4b-deduped-sharegpt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model beaugogh/pythia-1.4b-deduped-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T17:32:01.983101(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of beaugogh/pythia-1.4b-deduped-sharegpt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model beaugogh/pythia-1.4b-deduped-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T17:32:01.983101(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of beaugogh/pythia-1.4b-deduped-sharegpt## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model beaugogh/pythia-1.4b-deduped-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T17:32:01.983101(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
be57f37d672dcfc3e6a3a3236c978abbc15c7845
# Dataset Card for Evaluation run of beaugogh/Llama2-7b-sharegpt4

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/beaugogh/Llama2-7b-sharegpt4
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [beaugogh/Llama2-7b-sharegpt4](https://huggingface.co/beaugogh/Llama2-7b-sharegpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T12:02:42.509386](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4/blob/main/results_2023-10-13T12-02-42.509386.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001572986577181208,
        "em_stderr": 0.0004058451132417743,
        "f1": 0.06141988255033573,
        "f1_stderr": 0.0014263478827371335,
        "acc": 0.369226585159047,
        "acc_stderr": 0.008577465355756637
    },
    "harness|drop|3": {
        "em": 0.001572986577181208,
        "em_stderr": 0.0004058451132417743,
        "f1": 0.06141988255033573,
        "f1_stderr": 0.0014263478827371335
    },
    "harness|gsm8k|5": {
        "acc": 0.026535253980288095,
        "acc_stderr": 0.00442704598726516
    },
    "harness|winogrande|5": {
        "acc": 0.7119179163378059,
        "acc_stderr": 0.012727884724248115
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4
[ "region:us" ]
2023-08-18T11:01:37+00:00
{"pretty_name": "Evaluation run of beaugogh/Llama2-7b-sharegpt4", "dataset_summary": "Dataset automatically created during the evaluation run of model [beaugogh/Llama2-7b-sharegpt4](https://huggingface.co/beaugogh/Llama2-7b-sharegpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T12:02:42.509386](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4/blob/main/results_2023-10-13T12-02-42.509386.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417743,\n \"f1\": 0.06141988255033573,\n \"f1_stderr\": 0.0014263478827371335,\n \"acc\": 0.369226585159047,\n \"acc_stderr\": 0.008577465355756637\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417743,\n \"f1\": 0.06141988255033573,\n \"f1_stderr\": 0.0014263478827371335\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.026535253980288095,\n \"acc_stderr\": 0.00442704598726516\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7119179163378059,\n \"acc_stderr\": 0.012727884724248115\n }\n}\n```", "repo_url": "https://huggingface.co/beaugogh/Llama2-7b-sharegpt4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T12_02_42.509386", "path": ["**/details_harness|drop|3_2023-10-13T12-02-42.509386.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T12-02-42.509386.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T12_02_42.509386", "path": ["**/details_harness|gsm8k|5_2023-10-13T12-02-42.509386.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T12-02-42.509386.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:50:59.260675.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:50:59.260675.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T12_02_42.509386", "path": ["**/details_harness|winogrande|5_2023-10-13T12-02-42.509386.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T12-02-42.509386.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T11_50_59.260675", "path": ["results_2023-08-09T11:50:59.260675.parquet"]}, {"split": "2023_10_13T12_02_42.509386", "path": ["results_2023-10-13T12-02-42.509386.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T12-02-42.509386.parquet"]}]}]}
2023-10-13T11:02:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of beaugogh/Llama2-7b-sharegpt4 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model beaugogh/Llama2-7b-sharegpt4 on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T12:02:42.509386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of beaugogh/Llama2-7b-sharegpt4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model beaugogh/Llama2-7b-sharegpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T12:02:42.509386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of beaugogh/Llama2-7b-sharegpt4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model beaugogh/Llama2-7b-sharegpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T12:02:42.509386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of beaugogh/Llama2-7b-sharegpt4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model beaugogh/Llama2-7b-sharegpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T12:02:42.509386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7f1e350ac28990f43224efc012e668b0ca2ebb46
# Dataset Card for "mmlu-high_school_biology-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_biology-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:01:40+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6848, "num_examples": 5}, {"name": "test", "num_bytes": 953604, "num_examples": 310}], "download_size": 15677, "dataset_size": 960452}}
2023-08-21T06:34:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_biology-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_biology-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_biology-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 28 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_biology-neg-prepend-fix\"\n\nMore Information needed" ]
95d10eb713e440b5706554edd617cd346f0cd77b
# Dataset Card for Evaluation run of Mikael110/llama-2-7b-guanaco-fp16

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Mikael110/llama-2-7b-guanaco-fp16
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Mikael110/llama-2-7b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-7b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T21:40:11.783990](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16/blob/main/results_2023-09-22T21-40-11.783990.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0020973154362416107,
        "em_stderr": 0.00046850650303684253,
        "f1": 0.059943372483221714,
        "f1_stderr": 0.0013894963297796357,
        "acc": 0.40754847044560916,
        "acc_stderr": 0.009411574300699036
    },
    "harness|drop|3": {
        "em": 0.0020973154362416107,
        "em_stderr": 0.00046850650303684253,
        "f1": 0.059943372483221714,
        "f1_stderr": 0.0013894963297796357
    },
    "harness|gsm8k|5": {
        "acc": 0.06292645943896892,
        "acc_stderr": 0.006688762581532721
    },
    "harness|winogrande|5": {
        "acc": 0.7521704814522494,
        "acc_stderr": 0.01213438601986535
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16
[ "region:us" ]
2023-08-18T11:01:46+00:00
{"pretty_name": "Evaluation run of Mikael110/llama-2-7b-guanaco-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mikael110/llama-2-7b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-7b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T21:40:11.783990](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16/blob/main/results_2023-09-22T21-40-11.783990.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303684253,\n \"f1\": 0.059943372483221714,\n \"f1_stderr\": 0.0013894963297796357,\n \"acc\": 0.40754847044560916,\n \"acc_stderr\": 0.009411574300699036\n },\n \"harness|drop|3\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303684253,\n \"f1\": 0.059943372483221714,\n \"f1_stderr\": 0.0013894963297796357\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06292645943896892,\n \"acc_stderr\": 0.006688762581532721\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n }\n}\n```", "repo_url": "https://huggingface.co/Mikael110/llama-2-7b-guanaco-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T21_40_11.783990", "path": ["**/details_harness|drop|3_2023-09-22T21-40-11.783990.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T21-40-11.783990.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T21_40_11.783990", "path": ["**/details_harness|gsm8k|5_2023-09-22T21-40-11.783990.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T21-40-11.783990.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:28:02.065670.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:28:02.065670.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:28:02.065670.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:28:02.065670.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:28:02.065670.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:28:02.065670.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T21_40_11.783990", "path": ["**/details_harness|winogrande|5_2023-09-22T21-40-11.783990.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T21-40-11.783990.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T11_28_02.065670", "path": ["results_2023-07-24T11:28:02.065670.parquet"]}, {"split": "2023_09_22T21_40_11.783990", "path": ["results_2023-09-22T21-40-11.783990.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T21-40-11.783990.parquet"]}]}]}
2023-09-22T20:40:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Mikael110/llama-2-7b-guanaco-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Mikael110/llama-2-7b-guanaco-fp16 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T21:40:11.783990 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Mikael110/llama-2-7b-guanaco-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikael110/llama-2-7b-guanaco-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T21:40:11.783990(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Mikael110/llama-2-7b-guanaco-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikael110/llama-2-7b-guanaco-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T21:40:11.783990(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mikael110/llama-2-7b-guanaco-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikael110/llama-2-7b-guanaco-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T21:40:11.783990(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9ff2cbe6d6044ea74513df1945b44d6c292989ca
# Dataset Card for Evaluation run of Mikael110/llama-2-13b-guanaco-fp16

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Mikael110/llama-2-13b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T06:46:55.405946](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16/blob/main/results_2023-10-15T06-46-55.405946.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0024119127516778523,
        "em_stderr": 0.0005023380498893348,
        "f1": 0.0650419463087247,
        "f1_stderr": 0.0014141562591008796,
        "acc": 0.43250519246062497,
        "acc_stderr": 0.010503130855979311
    },
    "harness|drop|3": {
        "em": 0.0024119127516778523,
        "em_stderr": 0.0005023380498893348,
        "f1": 0.0650419463087247,
        "f1_stderr": 0.0014141562591008796
    },
    "harness|gsm8k|5": {
        "acc": 0.11599696739954511,
        "acc_stderr": 0.00882048549144247
    },
    "harness|winogrande|5": {
        "acc": 0.7490134175217048,
        "acc_stderr": 0.012185776220516153
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
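As a complement to the loading example in the Dataset Summary above, here is a minimal sketch of reading the aggregated "results" configuration that the card describes. The configuration name and the "latest" split are taken from the configs listed in this repository's metadata; treat the exact split name as an assumption and check the available splits if it differs.

```python
from datasets import load_dataset

# Minimal sketch (not part of the original card): load the aggregated "results"
# configuration mentioned above. The "latest" split name is assumed from the
# split names listed in this repository's configs.
results = load_dataset(
    "open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the most recent run
```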
open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16
[ "region:us" ]
2023-08-18T11:01:54+00:00
{"pretty_name": "Evaluation run of Mikael110/llama-2-13b-guanaco-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mikael110/llama-2-13b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T06:46:55.405946](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16/blob/main/results_2023-10-15T06-46-55.405946.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893348,\n \"f1\": 0.0650419463087247,\n \"f1_stderr\": 0.0014141562591008796,\n \"acc\": 0.43250519246062497,\n \"acc_stderr\": 0.010503130855979311\n },\n \"harness|drop|3\": {\n \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893348,\n \"f1\": 0.0650419463087247,\n \"f1_stderr\": 0.0014141562591008796\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11599696739954511,\n \"acc_stderr\": 0.00882048549144247\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516153\n }\n}\n```", "repo_url": "https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T06_46_55.405946", "path": ["**/details_harness|drop|3_2023-10-15T06-46-55.405946.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T06-46-55.405946.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T06_46_55.405946", "path": ["**/details_harness|gsm8k|5_2023-10-15T06-46-55.405946.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T06-46-55.405946.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:22:01.485033.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:22:01.485033.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:22:01.485033.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T06_46_55.405946", "path": ["**/details_harness|winogrande|5_2023-10-15T06-46-55.405946.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T06-46-55.405946.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T14_22_01.485033", "path": ["results_2023-07-24T14:22:01.485033.parquet"]}, {"split": "2023_10_15T06_46_55.405946", "path": ["results_2023-10-15T06-46-55.405946.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T06-46-55.405946.parquet"]}]}]}
2023-10-15T05:47:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Mikael110/llama-2-13b-guanaco-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Mikael110/llama-2-13b-guanaco-fp16 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T06:46:55.405946 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
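The card text above mentions loading the details from a run, but the corresponding snippet was dropped when the card was flattened into this field. Below is a minimal sketch of what such a call could look like, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming used by the other cards in this collection and using the `harness_winogrande_5` config and `latest` split listed in this record's metadata; the exact repository id is an assumption.

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the details_<org>__<model> pattern used by
# other cards in this collection; the config and split names come from this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```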
[ "# Dataset Card for Evaluation run of Mikael110/llama-2-13b-guanaco-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikael110/llama-2-13b-guanaco-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T06:46:55.405946(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Mikael110/llama-2-13b-guanaco-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikael110/llama-2-13b-guanaco-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T06:46:55.405946(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mikael110/llama-2-13b-guanaco-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikael110/llama-2-13b-guanaco-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T06:46:55.405946(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
df9e7d40726154909fe2ed0bfe8f5438e083104c
# Dataset Card for "mmlu-high_school_chemistry-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_chemistry-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:01:56+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5851, "num_examples": 5}, {"name": "test", "num_bytes": 425779, "num_examples": 203}], "download_size": 13316, "dataset_size": 431630}}
2023-08-21T06:35:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_chemistry-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_chemistry-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_chemistry-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_chemistry-neg-prepend-fix\"\n\nMore Information needed" ]
f9e503a9bddd0ccb1ffeea48abeb6a6816c84125
# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [yihan6324/llama2-7b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-09T21:00:12.284244](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt/blob/main/results_2023-08-09T21%3A00%3A12.284244.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.506231001015833, "acc_stderr": 0.03505018845563652, "acc_norm": 0.5099522031118208, "acc_norm_stderr": 0.035035258453899244, "mc1": 0.36474908200734396, "mc1_stderr": 0.01685096106172012, "mc2": 0.5317717765572597, "mc2_stderr": 0.015775374488304787 }, "harness|arc:challenge|25": { "acc": 0.5170648464163823, "acc_stderr": 0.014602878388536595, "acc_norm": 0.5511945392491467, "acc_norm_stderr": 0.014534599585097664 }, "harness|hellaswag|10": { "acc": 0.6041625174268074, "acc_stderr": 0.004880303863138504, "acc_norm": 0.7895837482573193, "acc_norm_stderr": 0.0040677125640782895 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4605263157894737, "acc_stderr": 0.04056242252249034, "acc_norm": 0.4605263157894737, "acc_norm_stderr": 0.04056242252249034 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5245283018867924, "acc_stderr": 0.030735822206205608, "acc_norm": 0.5245283018867924, "acc_norm_stderr": 0.030735822206205608 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4930555555555556, "acc_stderr": 0.04180806750294938, "acc_norm": 0.4930555555555556, "acc_norm_stderr": 0.04180806750294938 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4682080924855491, "acc_stderr": 0.03804749744364763, "acc_norm": 0.4682080924855491, "acc_norm_stderr": 0.03804749744364763 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.17647058823529413, "acc_stderr": 0.03793281185307809, "acc_norm": 0.17647058823529413, "acc_norm_stderr": 0.03793281185307809 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4723404255319149, "acc_stderr": 0.03263597118409769, "acc_norm": 0.4723404255319149, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.040969851398436716, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.040969851398436716 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.496551724137931, "acc_stderr": 0.04166567577101579, "acc_norm": 0.496551724137931, "acc_norm_stderr": 0.04166567577101579 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.02391998416404773, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.02391998416404773 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3492063492063492, "acc_stderr": 0.04263906892795133, 
"acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.04263906892795133 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.532258064516129, "acc_stderr": 0.028384747788813332, "acc_norm": 0.532258064516129, "acc_norm_stderr": 0.028384747788813332 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.35467980295566504, "acc_stderr": 0.0336612448905145, "acc_norm": 0.35467980295566504, "acc_norm_stderr": 0.0336612448905145 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6484848484848484, "acc_stderr": 0.037282069986826503, "acc_norm": 0.6484848484848484, "acc_norm_stderr": 0.037282069986826503 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5959595959595959, "acc_stderr": 0.034961309720561294, "acc_norm": 0.5959595959595959, "acc_norm_stderr": 0.034961309720561294 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7512953367875648, "acc_stderr": 0.031195840877700286, "acc_norm": 0.7512953367875648, "acc_norm_stderr": 0.031195840877700286 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4564102564102564, "acc_stderr": 0.025254485424799605, "acc_norm": 0.4564102564102564, "acc_norm_stderr": 0.025254485424799605 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085622, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085622 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.42436974789915966, "acc_stderr": 0.03210479051015776, "acc_norm": 0.42436974789915966, "acc_norm_stderr": 0.03210479051015776 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6990825688073394, "acc_stderr": 0.019664751366802114, "acc_norm": 0.6990825688073394, "acc_norm_stderr": 0.019664751366802114 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3425925925925926, "acc_stderr": 0.032365852526021574, "acc_norm": 0.3425925925925926, "acc_norm_stderr": 0.032365852526021574 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6764705882352942, "acc_stderr": 0.032834720561085606, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.032834720561085606 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7172995780590717, "acc_stderr": 0.029312814153955934, "acc_norm": 0.7172995780590717, "acc_norm_stderr": 0.029312814153955934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5964125560538116, "acc_stderr": 0.03292802819330314, "acc_norm": 0.5964125560538116, "acc_norm_stderr": 0.03292802819330314 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6183206106870229, "acc_stderr": 0.0426073515764456, "acc_norm": 0.6183206106870229, "acc_norm_stderr": 0.0426073515764456 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6115702479338843, "acc_stderr": 0.044492703500683836, "acc_norm": 0.6115702479338843, "acc_norm_stderr": 0.044492703500683836 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6111111111111112, "acc_stderr": 0.0471282125742677, "acc_norm": 
0.6111111111111112, "acc_norm_stderr": 0.0471282125742677 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5828220858895705, "acc_stderr": 0.038741028598180814, "acc_norm": 0.5828220858895705, "acc_norm_stderr": 0.038741028598180814 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.6601941747572816, "acc_stderr": 0.04689765937278135, "acc_norm": 0.6601941747572816, "acc_norm_stderr": 0.04689765937278135 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7564102564102564, "acc_stderr": 0.028120966503914397, "acc_norm": 0.7564102564102564, "acc_norm_stderr": 0.028120966503914397 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6794380587484036, "acc_stderr": 0.01668889331080376, "acc_norm": 0.6794380587484036, "acc_norm_stderr": 0.01668889331080376 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5635838150289018, "acc_stderr": 0.02670054542494367, "acc_norm": 0.5635838150289018, "acc_norm_stderr": 0.02670054542494367 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.264804469273743, "acc_stderr": 0.014756906483260659, "acc_norm": 0.264804469273743, "acc_norm_stderr": 0.014756906483260659 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5196078431372549, "acc_stderr": 0.028607893699576066, "acc_norm": 0.5196078431372549, "acc_norm_stderr": 0.028607893699576066 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5916398713826366, "acc_stderr": 0.027917050748484627, "acc_norm": 0.5916398713826366, "acc_norm_stderr": 0.027917050748484627 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5216049382716049, "acc_stderr": 0.027794760105008736, "acc_norm": 0.5216049382716049, "acc_norm_stderr": 0.027794760105008736 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.39361702127659576, "acc_stderr": 0.02914454478159615, "acc_norm": 0.39361702127659576, "acc_norm_stderr": 0.02914454478159615 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3820078226857888, "acc_stderr": 0.012409564470235565, "acc_norm": 0.3820078226857888, "acc_norm_stderr": 0.012409564470235565 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5257352941176471, "acc_stderr": 0.03033257809455504, "acc_norm": 0.5257352941176471, "acc_norm_stderr": 0.03033257809455504 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4918300653594771, "acc_stderr": 0.02022513434305726, "acc_norm": 0.4918300653594771, "acc_norm_stderr": 0.02022513434305726 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5909090909090909, "acc_stderr": 0.04709306978661896, "acc_norm": 0.5909090909090909, "acc_norm_stderr": 0.04709306978661896 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5918367346938775, "acc_stderr": 0.03146465712827424, "acc_norm": 0.5918367346938775, "acc_norm_stderr": 0.03146465712827424 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6915422885572139, "acc_stderr": 0.03265819588512698, "acc_norm": 0.6915422885572139, "acc_norm_stderr": 0.03265819588512698 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 
0.03799857454479636, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.03799857454479636 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7134502923976608, "acc_stderr": 0.03467826685703826, "acc_norm": 0.7134502923976608, "acc_norm_stderr": 0.03467826685703826 }, "harness|truthfulqa:mc|0": { "mc1": 0.36474908200734396, "mc1_stderr": 0.01685096106172012, "mc2": 0.5317717765572597, "mc2_stderr": 0.015775374488304787 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt
[ "region:us" ]
2023-08-18T11:02:04+00:00
{"pretty_name": "Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt", "dataset_summary": "Dataset automatically created during the evaluation run of model [yihan6324/llama2-7b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-09T21:00:12.284244](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt/blob/main/results_2023-08-09T21%3A00%3A12.284244.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.506231001015833,\n \"acc_stderr\": 0.03505018845563652,\n \"acc_norm\": 0.5099522031118208,\n \"acc_norm_stderr\": 0.035035258453899244,\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5317717765572597,\n \"mc2_stderr\": 0.015775374488304787\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536595,\n \"acc_norm\": 0.5511945392491467,\n \"acc_norm_stderr\": 0.014534599585097664\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6041625174268074,\n \"acc_stderr\": 0.004880303863138504,\n \"acc_norm\": 0.7895837482573193,\n \"acc_norm_stderr\": 0.0040677125640782895\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n \"acc_norm_stderr\": 0.04180806750294938\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02391998416404773,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02391998416404773\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.532258064516129,\n \"acc_stderr\": 0.028384747788813332,\n \"acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.028384747788813332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5959595959595959,\n \"acc_stderr\": 0.034961309720561294,\n \"acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.034961309720561294\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700286,\n \"acc_norm\": 0.7512953367875648,\n 
\"acc_norm_stderr\": 0.031195840877700286\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.025254485424799605,\n \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.025254485424799605\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6990825688073394,\n \"acc_stderr\": 0.019664751366802114,\n \"acc_norm\": 0.6990825688073394,\n \"acc_norm_stderr\": 0.019664751366802114\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.032365852526021574,\n \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.032365852526021574\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955934,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.0426073515764456,\n \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.0426073515764456\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6115702479338843,\n \"acc_stderr\": 0.044492703500683836,\n \"acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.044492703500683836\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.038741028598180814,\n \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.038741028598180814\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n \"acc_stderr\": 0.028120966503914397,\n \"acc_norm\": 0.7564102564102564,\n \"acc_norm_stderr\": 0.028120966503914397\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6794380587484036,\n \"acc_stderr\": 0.01668889331080376,\n \"acc_norm\": 0.6794380587484036,\n \"acc_norm_stderr\": 0.01668889331080376\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.02670054542494367,\n \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.02670054542494367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.014756906483260659,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.014756906483260659\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576066,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576066\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.027794760105008736,\n \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.027794760105008736\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159615,\n \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159615\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3820078226857888,\n \"acc_stderr\": 0.012409564470235565,\n \"acc_norm\": 0.3820078226857888,\n \"acc_norm_stderr\": 0.012409564470235565\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4918300653594771,\n \"acc_stderr\": 0.02022513434305726,\n \"acc_norm\": 0.4918300653594771,\n \"acc_norm_stderr\": 0.02022513434305726\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n \"acc_stderr\": 0.03265819588512698,\n \"acc_norm\": 0.6915422885572139,\n \"acc_norm_stderr\": 0.03265819588512698\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5317717765572597,\n \"mc2_stderr\": 0.015775374488304787\n }\n}\n```", "repo_url": "https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:00:12.284244.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T21_00_12.284244", "path": ["results_2023-08-09T21:00:12.284244.parquet"]}, {"split": "latest", "path": ["results_2023-08-09T21:00:12.284244.parquet"]}]}]}
2023-08-27T11:39:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model yihan6324/llama2-7b-instructmining-40k-sharegpt on the Open LLM Leaderboard. The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-09T21:00:12.284244 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yihan6324/llama2-7b-instructmining-40k-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-09T21:00:12.284244 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yihan6324/llama2-7b-instructmining-40k-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-09T21:00:12.284244 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yihan6324/llama2-7b-instructmining-40k-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-09T21:00:12.284244 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c3b2581ef8712a6f0acab3442fa980147fad3973
# Dataset Card for "mmlu-high_school_computer_science-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_computer_science-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:02:10+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 10062, "num_examples": 5}, {"name": "test", "num_bytes": 381800, "num_examples": 100}], "download_size": 22073, "dataset_size": 391862}}
2023-08-21T06:35:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_computer_science-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_computer_science-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_computer_science-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_computer_science-neg-prepend-fix\"\n\nMore Information needed" ]
a866c7df845b27ef8625ea7224b2b2502670d8c5
# Dataset Card for Evaluation run of FPHam/Free_Sydney_13b_HF ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/FPHam/Free_Sydney_13b_HF - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [FPHam/Free_Sydney_13b_HF](https://huggingface.co/FPHam/Free_Sydney_13b_HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T05:42:30.698824](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF/blob/main/results_2023-10-15T05-42-30.698824.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0016778523489932886, "em_stderr": 0.00041913301788268446, "f1": 0.06131187080536917, "f1_stderr": 0.0013635599924355774, "acc": 0.4258996525195177, "acc_stderr": 0.009976510388912537 }, "harness|drop|3": { "em": 0.0016778523489932886, "em_stderr": 0.00041913301788268446, "f1": 0.06131187080536917, "f1_stderr": 0.0013635599924355774 }, "harness|gsm8k|5": { "acc": 0.09173616376042457, "acc_stderr": 0.007950942148339331 }, "harness|winogrande|5": { "acc": 0.7600631412786109, "acc_stderr": 0.012002078629485742 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
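As a complement to the loading snippet in the card above, the aggregated numbers quoted under "Latest results" can also be read back from the `results` configuration that the card describes; this is a minimal sketch, assuming the `latest` split naming used throughout this repository's configs.

```python
from datasets import load_dataset

# Minimal sketch: pull the aggregated run results from the "results" config.
# The "latest" split name is assumed from the config layout of this repo.
results = load_dataset(
    "open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF",
    "results",
    split="latest",
)

# Each row stores the serialized metrics of one evaluation run.
print(results[0])
```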
open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF
[ "region:us" ]
2023-08-18T11:02:12+00:00
{"pretty_name": "Evaluation run of FPHam/Free_Sydney_13b_HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [FPHam/Free_Sydney_13b_HF](https://huggingface.co/FPHam/Free_Sydney_13b_HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T05:42:30.698824](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF/blob/main/results_2023-10-15T05-42-30.698824.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268446,\n \"f1\": 0.06131187080536917,\n \"f1_stderr\": 0.0013635599924355774,\n \"acc\": 0.4258996525195177,\n \"acc_stderr\": 0.009976510388912537\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268446,\n \"f1\": 0.06131187080536917,\n \"f1_stderr\": 0.0013635599924355774\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \"acc_stderr\": 0.007950942148339331\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.012002078629485742\n }\n}\n```", "repo_url": "https://huggingface.co/FPHam/Free_Sydney_13b_HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T05_42_30.698824", "path": ["**/details_harness|drop|3_2023-10-15T05-42-30.698824.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T05-42-30.698824.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T05_42_30.698824", "path": ["**/details_harness|gsm8k|5_2023-10-15T05-42-30.698824.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T05-42-30.698824.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:56:58.779734.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:56:58.779734.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T05_42_30.698824", "path": ["**/details_harness|winogrande|5_2023-10-15T05-42-30.698824.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T05-42-30.698824.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T10_56_58.779734", "path": ["results_2023-07-25T10:56:58.779734.parquet"]}, {"split": "2023_10_15T05_42_30.698824", "path": ["results_2023-10-15T05-42-30.698824.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T05-42-30.698824.parquet"]}]}]}
2023-10-15T04:42:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FPHam/Free_Sydney_13b_HF ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model FPHam/Free_Sydney_13b_HF on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T05:42:30.698824 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
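The summary above refers to a loading snippet that is not reproduced in this flattened text field. A minimal sketch follows; the repository id `open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF` is an assumption inferred from the leaderboard's usual `details_<org>__<model>` naming pattern, while the config and split names are taken from the configs listed in this record's metadata.

```python
# Minimal sketch of the load call referenced above (assumptions noted below).
from datasets import load_dataset

# NOTE: the repository id is inferred from the "details_<org>__<model>" naming
# pattern used by the Open LLM Leaderboard details repos; it is not confirmed
# by this record and may differ.
data = load_dataset(
    "open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF",
    "harness_winogrande_5",  # per-task configuration listed in the metadata
    split="latest",          # "latest" always points at the most recent run
)
```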
[ "# Dataset Card for Evaluation run of FPHam/Free_Sydney_13b_HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FPHam/Free_Sydney_13b_HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T05:42:30.698824(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FPHam/Free_Sydney_13b_HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FPHam/Free_Sydney_13b_HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T05:42:30.698824(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FPHam/Free_Sydney_13b_HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model FPHam/Free_Sydney_13b_HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T05:42:30.698824(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ea7f3ba3611efb11a80b6ce6c62e4e9d4dc5054a
# Dataset Card for Evaluation run of Azure99/blossom-v2-3b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Azure99/blossom-v2-3b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Azure99/blossom-v2-3b](https://huggingface.co/Azure99/blossom-v2-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v2-3b",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-09-16T18:36:49.609194](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-3b/blob/main/results_2023-09-16T18-36-49.609194.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.034395973154362415,
        "em_stderr": 0.0018663495487686885,
        "f1": 0.11167470637583889,
        "f1_stderr": 0.0023912000923338094,
        "acc": 0.2966551039299941,
        "acc_stderr": 0.007917209289296998
    },
    "harness|drop|3": {
        "em": 0.034395973154362415,
        "em_stderr": 0.0018663495487686885,
        "f1": 0.11167470637583889,
        "f1_stderr": 0.0023912000923338094
    },
    "harness|gsm8k|5": {
        "acc": 0.00530705079605762,
        "acc_stderr": 0.002001305720948054
    },
    "harness|winogrande|5": {
        "acc": 0.5880031570639306,
        "acc_stderr": 0.013833112857645942
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
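The snippet in the card loads a single per-task configuration. The sketch below shows the other access patterns this record's metadata describes; it only uses config names ("results", "harness_gsm8k_5") and the "latest" split alias that appear in the configs listed for this repository, and should be read as an illustration rather than the canonical usage.

```python
# Sketch: other ways to read this details repository, using only config and
# split names that appear in this record's metadata.
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Azure99__blossom-v2-3b"

# Aggregated metrics, pointing at the most recent run via the "latest" split.
results = load_dataset(REPO, "results", split="latest")

# Per-task details, e.g. the 5-shot GSM8K predictions from the latest run.
gsm8k = load_dataset(REPO, "harness_gsm8k_5", split="latest")
```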
open-llm-leaderboard/details_Azure99__blossom-v2-3b
[ "region:us" ]
2023-08-18T11:02:20+00:00
{"pretty_name": "Evaluation run of Azure99/blossom-v2-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azure99/blossom-v2-3b](https://huggingface.co/Azure99/blossom-v2-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v2-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T18:36:49.609194](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-3b/blob/main/results_2023-09-16T18-36-49.609194.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.034395973154362415,\n \"em_stderr\": 0.0018663495487686885,\n \"f1\": 0.11167470637583889,\n \"f1_stderr\": 0.0023912000923338094,\n \"acc\": 0.2966551039299941,\n \"acc_stderr\": 0.007917209289296998\n },\n \"harness|drop|3\": {\n \"em\": 0.034395973154362415,\n \"em_stderr\": 0.0018663495487686885,\n \"f1\": 0.11167470637583889,\n \"f1_stderr\": 0.0023912000923338094\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.002001305720948054\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5880031570639306,\n \"acc_stderr\": 0.013833112857645942\n }\n}\n```", "repo_url": "https://huggingface.co/Azure99/blossom-v2-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T18_36_49.609194", "path": ["**/details_harness|drop|3_2023-09-16T18-36-49.609194.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T18-36-49.609194.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T18_36_49.609194", "path": ["**/details_harness|gsm8k|5_2023-09-16T18-36-49.609194.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T18-36-49.609194.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:22:00.974376.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:22:00.974376.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T18_36_49.609194", "path": ["**/details_harness|winogrande|5_2023-09-16T18-36-49.609194.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T18-36-49.609194.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T15_22_00.974376", "path": ["results_2023-08-09T15:22:00.974376.parquet"]}, {"split": "2023_09_16T18_36_49.609194", "path": ["results_2023-09-16T18-36-49.609194.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T18-36-49.609194.parquet"]}]}]}
2023-09-16T17:37:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Azure99/blossom-v2-3b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Azure99/blossom-v2-3b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-16T18:36:49.609194 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
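The flattened card above ends with "do the following:" but its code block was stripped during flattening. As a hedged reconstruction only: the neighbouring evaluation-run cards all use the same one-line `load_dataset` call, so the missing snippet presumably resembled the sketch below. The repository id here is inferred from the `details_<org>__<model>` naming pattern of those records and is an assumption, not a quoted value; the config and split names are taken from this record's metadata.

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the details_<org>__<model> pattern used by the
# neighbouring evaluation-run datasets; it is not quoted from the original card.
repo = "open-llm-leaderboard/details_Azure99__blossom-v2-3b"

# Same shape as the snippets in the neighbouring cards: pick a task config and a split.
# "harness_winogrande_5" and the "latest" split both appear in this record's metadata.
data = load_dataset(repo, "harness_winogrande_5", split="latest")
```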
[ "# Dataset Card for Evaluation run of Azure99/blossom-v2-3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Azure99/blossom-v2-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T18:36:49.609194(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Azure99/blossom-v2-3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Azure99/blossom-v2-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T18:36:49.609194(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Azure99/blossom-v2-3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Azure99/blossom-v2-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T18:36:49.609194(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c5c63055d675eedd467f2776545afb9fa1623075
# Dataset Card for "mmlu-high_school_european_history-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
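The card itself is still a stub, but the dataset metadata below describes a default config with `dev` and `test` splits and lists the per-example fields, so a minimal loading sketch can be written against them (an illustration based on that metadata, not part of the original card):

```python
from datasets import load_dataset

# Load the "test" split of the default config; the metadata also lists a small "dev" split.
dataset = load_dataset("joey234/mmlu-high_school_european_history-neg-prepend-fix", split="test")

example = dataset[0]
print(example["question"])      # original MMLU question
print(example["neg_question"])  # negated rewrite of the question
print(example["choices"])       # the four answer options

# "answer" is a ClassLabel; map the stored integer back to its letter name (A/B/C/D).
print(dataset.features["answer"].int2str(example["answer"]))
```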
joey234/mmlu-high_school_european_history-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:02:26+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 35805, "num_examples": 5}, {"name": "test", "num_bytes": 1243562, "num_examples": 165}], "download_size": 66756, "dataset_size": 1279367}}
2023-08-21T06:35:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_european_history-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_european_history-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_european_history-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 31 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_european_history-neg-prepend-fix\"\n\nMore Information needed" ]
41a901f1d460c636a50c27511f829ebf39d35688
# Dataset Card for Evaluation run of haonan-li/bactrian-x-llama-13b-merged ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [haonan-li/bactrian-x-llama-13b-merged](https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-18T01:46:24.914160](https://huggingface.co/datasets/open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged/blob/main/results_2023-09-18T01-46-24.914160.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.2553481543624161, "em_stderr": 0.004465629087714431, "f1": 0.31091652684563814, "f1_stderr": 0.004443751758152442, "acc": 0.3974435920159074, "acc_stderr": 0.009316527734088942 }, "harness|drop|3": { "em": 0.2553481543624161, "em_stderr": 0.004465629087714431, "f1": 0.31091652684563814, "f1_stderr": 0.004443751758152442 }, "harness|gsm8k|5": { "acc": 0.05534495830174375, "acc_stderr": 0.006298221796179595 }, "harness|winogrande|5": { "acc": 0.739542225730071, "acc_stderr": 0.012334833671998289 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
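Beyond the single call shown in the card, the per-task configurations enumerated in this record's metadata can be listed and queried one by one. A small illustrative sketch, with config and split names taken from that metadata:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged"

# One config per evaluated task (harness_*), plus an aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Each config exposes a timestamped split per run and a "latest" split for the newest run.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details[0])

# Aggregated scores for every run live under the "results" config.
results = load_dataset(repo, "results", split="latest")
```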
open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged
[ "region:us" ]
2023-08-18T11:02:29+00:00
{"pretty_name": "Evaluation run of haonan-li/bactrian-x-llama-13b-merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [haonan-li/bactrian-x-llama-13b-merged](https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T01:46:24.914160](https://huggingface.co/datasets/open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged/blob/main/results_2023-09-18T01-46-24.914160.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2553481543624161,\n \"em_stderr\": 0.004465629087714431,\n \"f1\": 0.31091652684563814,\n \"f1_stderr\": 0.004443751758152442,\n \"acc\": 0.3974435920159074,\n \"acc_stderr\": 0.009316527734088942\n },\n \"harness|drop|3\": {\n \"em\": 0.2553481543624161,\n \"em_stderr\": 0.004465629087714431,\n \"f1\": 0.31091652684563814,\n \"f1_stderr\": 0.004443751758152442\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \"acc_stderr\": 0.006298221796179595\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998289\n }\n}\n```", "repo_url": "https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T01_46_24.914160", "path": ["**/details_harness|drop|3_2023-09-18T01-46-24.914160.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T01-46-24.914160.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T01_46_24.914160", "path": ["**/details_harness|gsm8k|5_2023-09-18T01-46-24.914160.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T01-46-24.914160.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:16:59.483464.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:59.483464.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:59.483464.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:59.483464.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:16:59.483464.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:16:59.483464.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T01_46_24.914160", "path": ["**/details_harness|winogrande|5_2023-09-18T01-46-24.914160.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T01-46-24.914160.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_16_59.483464", "path": ["results_2023-07-19T19:16:59.483464.parquet"]}, {"split": "2023_09_18T01_46_24.914160", "path": ["results_2023-09-18T01-46-24.914160.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T01-46-24.914160.parquet"]}]}]}
2023-09-18T00:46:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of haonan-li/bactrian-x-llama-13b-merged ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model haonan-li/bactrian-x-llama-13b-merged on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-18T01:46:24.914160 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of haonan-li/bactrian-x-llama-13b-merged", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model haonan-li/bactrian-x-llama-13b-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T01:46:24.914160(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of haonan-li/bactrian-x-llama-13b-merged", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model haonan-li/bactrian-x-llama-13b-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T01:46:24.914160(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of haonan-li/bactrian-x-llama-13b-merged## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model haonan-li/bactrian-x-llama-13b-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T01:46:24.914160(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
209fb80052313c560694848375cc07a80729c4d4
# Dataset Card for Evaluation run of heegyu/LIMA2-7b-hf ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/heegyu/LIMA2-7b-hf - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [heegyu/LIMA2-7b-hf](https://huggingface.co/heegyu/LIMA2-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_heegyu__LIMA2-7b-hf", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-17T05:21:15.602627](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-7b-hf/blob/main/results_2023-10-17T05-21-15.602627.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.01960989932885906, "em_stderr": 0.0014199622282460504, "f1": 0.07630557885906028, "f1_stderr": 0.0019245142453210734, "acc": 0.3689776582077379, "acc_stderr": 0.009099296828401368 }, "harness|drop|3": { "em": 0.01960989932885906, "em_stderr": 0.0014199622282460504, "f1": 0.07630557885906028, "f1_stderr": 0.0019245142453210734 }, "harness|gsm8k|5": { "acc": 0.03866565579984837, "acc_stderr": 0.0053105831620980076 }, "harness|winogrande|5": { "acc": 0.6992896606156275, "acc_stderr": 0.012888010494704727 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
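The DROP and GSM8K figures quoted in the "Latest results" block are aggregates; the per-example details behind them sit in the corresponding task configs. A brief illustrative sketch, with config and split names taken from this record's metadata:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_heegyu__LIMA2-7b-hf"

# Per-example details behind the aggregated DROP em/f1 scores.
drop_details = load_dataset(repo, "harness_drop_3", split="latest")

# Per-example details behind the GSM8K accuracy.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")

print(len(drop_details), len(gsm8k_details))
print(drop_details[0])
```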
open-llm-leaderboard/details_heegyu__LIMA2-7b-hf
[ "region:us" ]
2023-08-18T11:02:38+00:00
{"pretty_name": "Evaluation run of heegyu/LIMA2-7b-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [heegyu/LIMA2-7b-hf](https://huggingface.co/heegyu/LIMA2-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__LIMA2-7b-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T05:21:15.602627](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-7b-hf/blob/main/results_2023-10-17T05-21-15.602627.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01960989932885906,\n \"em_stderr\": 0.0014199622282460504,\n \"f1\": 0.07630557885906028,\n \"f1_stderr\": 0.0019245142453210734,\n \"acc\": 0.3689776582077379,\n \"acc_stderr\": 0.009099296828401368\n },\n \"harness|drop|3\": {\n \"em\": 0.01960989932885906,\n \"em_stderr\": 0.0014199622282460504,\n \"f1\": 0.07630557885906028,\n \"f1_stderr\": 0.0019245142453210734\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03866565579984837,\n \"acc_stderr\": 0.0053105831620980076\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6992896606156275,\n \"acc_stderr\": 0.012888010494704727\n }\n}\n```", "repo_url": "https://huggingface.co/heegyu/LIMA2-7b-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T05_21_15.602627", "path": ["**/details_harness|drop|3_2023-10-17T05-21-15.602627.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T05-21-15.602627.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T05_21_15.602627", "path": ["**/details_harness|gsm8k|5_2023-10-17T05-21-15.602627.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T05-21-15.602627.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:35:20.569922.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:35:20.569922.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:35:20.569922.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:35:20.569922.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:35:20.569922.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:35:20.569922.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T05_21_15.602627", "path": ["**/details_harness|winogrande|5_2023-10-17T05-21-15.602627.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T05-21-15.602627.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T10_35_20.569922", "path": ["results_2023-08-09T10:35:20.569922.parquet"]}, {"split": "2023_10_17T05_21_15.602627", "path": ["results_2023-10-17T05-21-15.602627.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T05-21-15.602627.parquet"]}]}]}
2023-10-17T04:21:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of heegyu/LIMA2-7b-hf

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model heegyu/LIMA2-7b-hf on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-17T05:21:15.602627 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
[ "# Dataset Card for Evaluation run of heegyu/LIMA2-7b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/LIMA2-7b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T05:21:15.602627(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of heegyu/LIMA2-7b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/LIMA2-7b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T05:21:15.602627(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of heegyu/LIMA2-7b-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/LIMA2-7b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T05:21:15.602627(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
98df7167b8ee4f5e919668d3612d58dae23c0b10
# Dataset Card for "mmlu-high_school_geography-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_geography-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:02:41+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5845, "num_examples": 5}, {"name": "test", "num_bytes": 477448, "num_examples": 198}], "download_size": 12727, "dataset_size": 483293}}
2023-08-21T06:35:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_geography-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_geography-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_geography-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 28 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_geography-neg-prepend-fix\"\n\nMore Information needed" ]
5604e6871e37c06a1f93525983b67d77c8e9f9b4
# Dataset Card for Evaluation run of heegyu/WizardVicuna-Uncensored-3B-0719

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/heegyu/WizardVicuna-Uncensored-3B-0719
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna-Uncensored-3B-0719](https://huggingface.co/heegyu/WizardVicuna-Uncensored-3B-0719) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__WizardVicuna-Uncensored-3B-0719",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-19T03:10:00.849734](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-Uncensored-3B-0719/blob/main/results_2023-10-19T03-10-00.849734.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0032508389261744967,
        "em_stderr": 0.0005829486708558908,
        "f1": 0.05307046979865784,
        "f1_stderr": 0.0013744215109358906,
        "acc": 0.32454958283792285,
        "acc_stderr": 0.008214760837520624
    },
    "harness|drop|3": {
        "em": 0.0032508389261744967,
        "em_stderr": 0.0005829486708558908,
        "f1": 0.05307046979865784,
        "f1_stderr": 0.0013744215109358906
    },
    "harness|gsm8k|5": {
        "acc": 0.011372251705837756,
        "acc_stderr": 0.002920666198788741
    },
    "harness|winogrande|5": {
        "acc": 0.6377269139700079,
        "acc_stderr": 0.013508855476252508
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
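As with the snippet in the card, any of the configurations listed in this record's metadata can be loaded directly. The example below is a sketch rather than part of the original card: it uses the "harness_gsm8k_5" config and the "latest" split exactly as they appear in the metadata, while everything else (variable names, the printed summary) is illustrative.

```python
from datasets import load_dataset

# Per-sample details for the 5-shot GSM8K run of this model.
# "harness_gsm8k_5" and "latest" are taken from the card's config metadata.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_heegyu__WizardVicuna-Uncensored-3B-0719",
    "harness_gsm8k_5",
    split="latest",
)

print(f"{len(gsm8k_details)} evaluated samples")
print(gsm8k_details.column_names)
```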
open-llm-leaderboard/details_heegyu__WizardVicuna-Uncensored-3B-0719
[ "region:us" ]
2023-08-18T11:02:47+00:00
{"pretty_name": "Evaluation run of heegyu/WizardVicuna-Uncensored-3B-0719", "dataset_summary": "Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna-Uncensored-3B-0719](https://huggingface.co/heegyu/WizardVicuna-Uncensored-3B-0719) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__WizardVicuna-Uncensored-3B-0719\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T03:10:00.849734](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-Uncensored-3B-0719/blob/main/results_2023-10-19T03-10-00.849734.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0032508389261744967,\n \"em_stderr\": 0.0005829486708558908,\n \"f1\": 0.05307046979865784,\n \"f1_stderr\": 0.0013744215109358906,\n \"acc\": 0.32454958283792285,\n \"acc_stderr\": 0.008214760837520624\n },\n \"harness|drop|3\": {\n \"em\": 0.0032508389261744967,\n \"em_stderr\": 0.0005829486708558908,\n \"f1\": 0.05307046979865784,\n \"f1_stderr\": 0.0013744215109358906\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \"acc_stderr\": 0.002920666198788741\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6377269139700079,\n \"acc_stderr\": 0.013508855476252508\n }\n}\n```", "repo_url": "https://huggingface.co/heegyu/WizardVicuna-Uncensored-3B-0719", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T03_10_00.849734", "path": ["**/details_harness|drop|3_2023-10-19T03-10-00.849734.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T03-10-00.849734.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T03_10_00.849734", "path": ["**/details_harness|gsm8k|5_2023-10-19T03-10-00.849734.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T03-10-00.849734.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:29:51.933578.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:29:51.933578.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:29:51.933578.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:29:51.933578.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:29:51.933578.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:29:51.933578.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T03_10_00.849734", "path": ["**/details_harness|winogrande|5_2023-10-19T03-10-00.849734.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T03-10-00.849734.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T10_29_51.933578", "path": ["results_2023-07-24T10:29:51.933578.parquet"]}, {"split": "2023_10_19T03_10_00.849734", "path": ["results_2023-10-19T03-10-00.849734.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T03-10-00.849734.parquet"]}]}]}
2023-10-19T02:10:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of heegyu/WizardVicuna-Uncensored-3B-0719 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model heegyu/WizardVicuna-Uncensored-3B-0719 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-19T03:10:00.849734 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of heegyu/WizardVicuna-Uncensored-3B-0719", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/WizardVicuna-Uncensored-3B-0719 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-19T03:10:00.849734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of heegyu/WizardVicuna-Uncensored-3B-0719", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/WizardVicuna-Uncensored-3B-0719 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-19T03:10:00.849734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of heegyu/WizardVicuna-Uncensored-3B-0719## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/WizardVicuna-Uncensored-3B-0719 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T03:10:00.849734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
193b4db8c44489b98a37514c18f8956a6a199539
# Dataset Card for "mmlu-high_school_government_and_politics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_government_and_politics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:02:55+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6650, "num_examples": 5}, {"name": "test", "num_bytes": 592819, "num_examples": 193}], "download_size": 13885, "dataset_size": 599469}}
2023-08-21T06:35:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_government_and_politics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_government_and_politics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_government_and_politics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 33 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_government_and_politics-neg-prepend-fix\"\n\nMore Information needed" ]
f39317eb473cefdd5c99a03f1c24f0d9f1c8d387
# Dataset Card for Evaluation run of heegyu/LIMA2-13b-hf

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/heegyu/LIMA2-13b-hf
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [heegyu/LIMA2-13b-hf](https://huggingface.co/heegyu/LIMA2-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__LIMA2-13b-hf",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T00:28:18.061876](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-13b-hf/blob/main/results_2023-10-22T00-28-18.061876.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2590184563758389,
        "em_stderr": 0.004486510640529356,
        "f1": 0.3212950922818803,
        "f1_stderr": 0.004447928613953936,
        "acc": 0.3950291202646285,
        "acc_stderr": 0.009430155888357935
    },
    "harness|drop|3": {
        "em": 0.2590184563758389,
        "em_stderr": 0.004486510640529356,
        "f1": 0.3212950922818803,
        "f1_stderr": 0.004447928613953936
    },
    "harness|gsm8k|5": {
        "acc": 0.0576194086429113,
        "acc_stderr": 0.006418593319822863
    },
    "harness|winogrande|5": {
        "acc": 0.7324388318863457,
        "acc_stderr": 0.012441718456893009
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_heegyu__LIMA2-13b-hf
[ "region:us" ]
2023-08-18T11:02:55+00:00
{"pretty_name": "Evaluation run of heegyu/LIMA2-13b-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [heegyu/LIMA2-13b-hf](https://huggingface.co/heegyu/LIMA2-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__LIMA2-13b-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T00:28:18.061876](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-13b-hf/blob/main/results_2023-10-22T00-28-18.061876.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2590184563758389,\n \"em_stderr\": 0.004486510640529356,\n \"f1\": 0.3212950922818803,\n \"f1_stderr\": 0.004447928613953936,\n \"acc\": 0.3950291202646285,\n \"acc_stderr\": 0.009430155888357935\n },\n \"harness|drop|3\": {\n \"em\": 0.2590184563758389,\n \"em_stderr\": 0.004486510640529356,\n \"f1\": 0.3212950922818803,\n \"f1_stderr\": 0.004447928613953936\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0576194086429113,\n \"acc_stderr\": 0.006418593319822863\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.012441718456893009\n }\n}\n```", "repo_url": "https://huggingface.co/heegyu/LIMA2-13b-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T00_28_18.061876", "path": ["**/details_harness|drop|3_2023-10-22T00-28-18.061876.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T00-28-18.061876.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T00_28_18.061876", "path": ["**/details_harness|gsm8k|5_2023-10-22T00-28-18.061876.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T00-28-18.061876.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:19:08.555277.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:19:08.555277.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:19:08.555277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:19:08.555277.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:19:08.555277.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:19:08.555277.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T00_28_18.061876", "path": ["**/details_harness|winogrande|5_2023-10-22T00-28-18.061876.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T00-28-18.061876.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T15_19_08.555277", "path": ["results_2023-08-09T15:19:08.555277.parquet"]}, {"split": "2023_10_22T00_28_18.061876", "path": ["results_2023-10-22T00-28-18.061876.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T00-28-18.061876.parquet"]}]}]}
2023-10-21T23:28:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of heegyu/LIMA2-13b-hf ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model heegyu/LIMA2-13b-hf on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-22T00:28:18.061876 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of heegyu/LIMA2-13b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/LIMA2-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T00:28:18.061876(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of heegyu/LIMA2-13b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/LIMA2-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T00:28:18.061876(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of heegyu/LIMA2-13b-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/LIMA2-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T00:28:18.061876(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f14fe6c625c8b4bad409692710cfc36c0a4670af
# Dataset Card for Evaluation run of heegyu/RedTulu-Uncensored-3B-0719

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/RedTulu-Uncensored-3B-0719
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [heegyu/RedTulu-Uncensored-3B-0719](https://huggingface.co/heegyu/RedTulu-Uncensored-3B-0719) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719",
	"harness_winogrande_5",
	split="train")
```

(An additional example for loading the aggregated "results" configuration is sketched just after this card.)

## Latest results

These are the [latest results from run 2023-10-21T23:03:56.733813](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719/blob/main/results_2023-10-21T23-03-56.733813.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.23049496644295303,
        "em_stderr": 0.004312966716420502,
        "f1": 0.27103292785234895,
        "f1_stderr": 0.004338201280350465,
        "acc": 0.3231323148471164,
        "acc_stderr": 0.008861776299208445
    },
    "harness|drop|3": {
        "em": 0.23049496644295303,
        "em_stderr": 0.004312966716420502,
        "f1": 0.27103292785234895,
        "f1_stderr": 0.004338201280350465
    },
    "harness|gsm8k|5": {
        "acc": 0.022744503411675512,
        "acc_stderr": 0.004106620637749709
    },
    "harness|winogrande|5": {
        "acc": 0.6235201262825573,
        "acc_stderr": 0.013616931960667182
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
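As a complement to the per-task snippet above, here is a minimal sketch for pulling the aggregated scores instead of per-sample details. It assumes only what the card and its metadata state: the repository id `open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719`, the `results` configuration, and its `latest` split; any particular column name is not guaranteed, so the sketch inspects the split before using it.

```python
from datasets import load_dataset

# Load the aggregated results for the most recent run of this model.
# The "results" config and its "latest" split are listed in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719",
    "results",
    split="latest",
)

# Inspect the split and its columns before relying on any specific field name.
print(results)
print(results.column_names)
```

Loading a dated split instead of `latest` (for example `split="2023_10_21T23_03_56.733813"`, as listed in the metadata) pins the results to one specific run, which is useful when comparing successive evaluations of the same model.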
open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719
[ "region:us" ]
2023-08-18T11:03:04+00:00
{"pretty_name": "Evaluation run of heegyu/RedTulu-Uncensored-3B-0719", "dataset_summary": "Dataset automatically created during the evaluation run of model [heegyu/RedTulu-Uncensored-3B-0719](https://huggingface.co/heegyu/RedTulu-Uncensored-3B-0719) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T23:03:56.733813](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719/blob/main/results_2023-10-21T23-03-56.733813.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23049496644295303,\n \"em_stderr\": 0.004312966716420502,\n \"f1\": 0.27103292785234895,\n \"f1_stderr\": 0.004338201280350465,\n \"acc\": 0.3231323148471164,\n \"acc_stderr\": 0.008861776299208445\n },\n \"harness|drop|3\": {\n \"em\": 0.23049496644295303,\n \"em_stderr\": 0.004312966716420502,\n \"f1\": 0.27103292785234895,\n \"f1_stderr\": 0.004338201280350465\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \"acc_stderr\": 0.004106620637749709\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6235201262825573,\n \"acc_stderr\": 0.013616931960667182\n }\n}\n```", "repo_url": "https://huggingface.co/heegyu/RedTulu-Uncensored-3B-0719", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T23_03_56.733813", "path": ["**/details_harness|drop|3_2023-10-21T23-03-56.733813.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T23-03-56.733813.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T23_03_56.733813", "path": ["**/details_harness|gsm8k|5_2023-10-21T23-03-56.733813.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T23-03-56.733813.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:33:22.624051.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:33:22.624051.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:33:22.624051.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:33:22.624051.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:33:22.624051.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T23_03_56.733813", "path": ["**/details_harness|winogrande|5_2023-10-21T23-03-56.733813.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T23-03-56.733813.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T10_33_22.624051", "path": ["results_2023-07-24T10:33:22.624051.parquet"]}, {"split": "2023_10_21T23_03_56.733813", "path": ["results_2023-10-21T23-03-56.733813.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T23-03-56.733813.parquet"]}]}]}
2023-10-21T22:04:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of heegyu/RedTulu-Uncensored-3B-0719 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model heegyu/RedTulu-Uncensored-3B-0719 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-21T23:03:56.733813 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
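The flattened card above keeps the sentence "To load the details from a run, you can for instance do the following:" but drops the code that originally followed it. A minimal reconstruction is sketched below; the repository id is an assumption inferred from the `open-llm-leaderboard/details_<org>__<model>` naming pattern visible in the other records of this dump, while the config name and the "latest" split are taken from this record's metadata.

```python
# Sketch only: the repository id below is inferred, not stated in this record's text.
from datasets import load_dataset

repo = "open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719"  # assumed id
# "harness_winogrande_5" and the "latest" split are declared in the record's metadata.
data = load_dataset(repo, "harness_winogrande_5", split="latest")
print(data[0])
```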
[ "# Dataset Card for Evaluation run of heegyu/RedTulu-Uncensored-3B-0719", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/RedTulu-Uncensored-3B-0719 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-21T23:03:56.733813(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of heegyu/RedTulu-Uncensored-3B-0719", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/RedTulu-Uncensored-3B-0719 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-21T23:03:56.733813(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of heegyu/RedTulu-Uncensored-3B-0719## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/RedTulu-Uncensored-3B-0719 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T23:03:56.733813(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9402b037f0bf030511757b1238151914b79fdb0b
# Dataset Card for "mmlu-high_school_macroeconomics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_macroeconomics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:03:10+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5403, "num_examples": 5}, {"name": "test", "num_bytes": 994039, "num_examples": 390}], "download_size": 12073, "dataset_size": 999442}}
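The config above declares a single `default` configuration with `dev` and `test` splits and, among other features, `question`, `choices`, `answer`, and `neg_question`. A minimal loading sketch under those assumptions (the repository id is taken from this record):

```python
# Minimal sketch based only on the splits and features declared in the config above.
from datasets import load_dataset

ds = load_dataset("joey234/mmlu-high_school_macroeconomics-neg-prepend-fix", split="test")
example = ds[0]
print(example["question"])      # original question text
print(example["neg_question"])  # the neg_question field declared in the config
print(example["choices"], example["answer"])
```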
2023-08-21T06:36:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_macroeconomics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_macroeconomics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_macroeconomics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 30 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_macroeconomics-neg-prepend-fix\"\n\nMore Information needed" ]
170eded7246b40b3ff7d730f8e9c9cc90e3b0679
# Dataset Card for Evaluation run of heegyu/WizardVicuna-3B-0719 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/heegyu/WizardVicuna-3B-0719 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna-3B-0719](https://huggingface.co/heegyu/WizardVicuna-3B-0719) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-16T02:17:13.901825](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719/blob/main/results_2023-10-16T02-17-13.901825.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0037751677852348995, "em_stderr": 0.0006280387809484416, "f1": 0.051164010067114116, "f1_stderr": 0.0013358298092264022, "acc": 0.32304884054493466, "acc_stderr": 0.007945880591434263 }, "harness|drop|3": { "em": 0.0037751677852348995, "em_stderr": 0.0006280387809484416, "f1": 0.051164010067114116, "f1_stderr": 0.0013358298092264022 }, "harness|gsm8k|5": { "acc": 0.0075815011372251705, "acc_stderr": 0.0023892815120772344 }, "harness|winogrande|5": { "acc": 0.6385161799526441, "acc_stderr": 0.013502479670791292 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
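As a supplement to the card above (not part of the original): its snippet uses `split="train"`, while the splits actually declared in this record's metadata are the timestamped ones plus a `latest` alias, so a sketch that relies only on the declared names would be:

```python
# Sketch assuming only the config and split names declared in this record's metadata.
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719"
print(get_dataset_config_names(repo))                    # the harness_* configs plus "results"
results = load_dataset(repo, "results", split="latest")  # aggregated metrics of the newest run
print(results[0])
```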
open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719
[ "region:us" ]
2023-08-18T11:03:13+00:00
{"pretty_name": "Evaluation run of heegyu/WizardVicuna-3B-0719", "dataset_summary": "Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna-3B-0719](https://huggingface.co/heegyu/WizardVicuna-3B-0719) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T02:17:13.901825](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719/blob/main/results_2023-10-16T02-17-13.901825.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0037751677852348995,\n \"em_stderr\": 0.0006280387809484416,\n \"f1\": 0.051164010067114116,\n \"f1_stderr\": 0.0013358298092264022,\n \"acc\": 0.32304884054493466,\n \"acc_stderr\": 0.007945880591434263\n },\n \"harness|drop|3\": {\n \"em\": 0.0037751677852348995,\n \"em_stderr\": 0.0006280387809484416,\n \"f1\": 0.051164010067114116,\n \"f1_stderr\": 0.0013358298092264022\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.0023892815120772344\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6385161799526441,\n \"acc_stderr\": 0.013502479670791292\n }\n}\n```", "repo_url": "https://huggingface.co/heegyu/WizardVicuna-3B-0719", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T02_17_13.901825", "path": ["**/details_harness|drop|3_2023-10-16T02-17-13.901825.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T02-17-13.901825.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T02_17_13.901825", "path": ["**/details_harness|gsm8k|5_2023-10-16T02-17-13.901825.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T02-17-13.901825.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:31:33.839492.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:31:33.839492.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:31:33.839492.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:31:33.839492.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:31:33.839492.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T02_17_13.901825", "path": ["**/details_harness|winogrande|5_2023-10-16T02-17-13.901825.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T02-17-13.901825.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T10_31_33.839492", "path": ["results_2023-07-24T10:31:33.839492.parquet"]}, {"split": "2023_10_16T02_17_13.901825", "path": ["results_2023-10-16T02-17-13.901825.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T02-17-13.901825.parquet"]}]}]}
2023-10-16T01:17:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of heegyu/WizardVicuna-3B-0719 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model heegyu/WizardVicuna-3B-0719 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T02:17:13.901825 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of heegyu/WizardVicuna-3B-0719", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/WizardVicuna-3B-0719 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T02:17:13.901825(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of heegyu/WizardVicuna-3B-0719", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/WizardVicuna-3B-0719 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T02:17:13.901825(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of heegyu/WizardVicuna-3B-0719## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/WizardVicuna-3B-0719 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T02:17:13.901825(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9238cee7dce8a55d9f424c69c11c2d869d418d27
# Dataset Card for Evaluation run of heegyu/WizardVicuna2-13b-hf ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/heegyu/WizardVicuna2-13b-hf - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna2-13b-hf](https://huggingface.co/heegyu/WizardVicuna2-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T20:35:02.988920](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf/blob/main/results_2023-10-23T20-35-02.988920.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.17806208053691275, "em_stderr": 0.003917823631096753, "f1": 0.23031459731543547, "f1_stderr": 0.003944169111986955, "acc": 0.4045526704895304, "acc_stderr": 0.009815196819519213 }, "harness|drop|3": { "em": 0.17806208053691275, "em_stderr": 0.003917823631096753, "f1": 0.23031459731543547, "f1_stderr": 0.003944169111986955 }, "harness|gsm8k|5": { "acc": 0.07429871114480667, "acc_stderr": 0.007223844172845566 }, "harness|winogrande|5": { "acc": 0.7348066298342542, "acc_stderr": 0.01240654946619286 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
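A follow-up sketch (an illustration, not part of the original card): pulling the per-sample details for two of the configs named in this record's metadata, using the `latest` split alias declared there.

```python
# Sketch: config names and the "latest" split come from this record's metadata.
from datasets import load_dataset

repo = "open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf"
drop_details = load_dataset(repo, "harness_drop_3", split="latest")    # per-sample DROP details
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")  # per-sample GSM8K details
print(len(drop_details), len(gsm8k_details))
```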
open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf
[ "region:us" ]
2023-08-18T11:03:22+00:00
{"pretty_name": "Evaluation run of heegyu/WizardVicuna2-13b-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna2-13b-hf](https://huggingface.co/heegyu/WizardVicuna2-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T20:35:02.988920](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf/blob/main/results_2023-10-23T20-35-02.988920.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17806208053691275,\n \"em_stderr\": 0.003917823631096753,\n \"f1\": 0.23031459731543547,\n \"f1_stderr\": 0.003944169111986955,\n \"acc\": 0.4045526704895304,\n \"acc_stderr\": 0.009815196819519213\n },\n \"harness|drop|3\": {\n \"em\": 0.17806208053691275,\n \"em_stderr\": 0.003917823631096753,\n \"f1\": 0.23031459731543547,\n \"f1_stderr\": 0.003944169111986955\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07429871114480667,\n \"acc_stderr\": 0.007223844172845566\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n }\n}\n```", "repo_url": "https://huggingface.co/heegyu/WizardVicuna2-13b-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T20_35_02.988920", "path": ["**/details_harness|drop|3_2023-10-23T20-35-02.988920.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T20-35-02.988920.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T20_35_02.988920", "path": ["**/details_harness|gsm8k|5_2023-10-23T20-35-02.988920.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T20-35-02.988920.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:23:39.656390.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:23:39.656390.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T20_35_02.988920", "path": ["**/details_harness|winogrande|5_2023-10-23T20-35-02.988920.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T20-35-02.988920.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T15_23_39.656390", "path": ["results_2023-08-09T15:23:39.656390.parquet"]}, {"split": "2023_10_23T20_35_02.988920", "path": ["results_2023-10-23T20-35-02.988920.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T20-35-02.988920.parquet"]}]}]}
2023-10-23T19:35:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of heegyu/WizardVicuna2-13b-hf ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model heegyu/WizardVicuna2-13b-hf on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T20:35:02.988920(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of heegyu/WizardVicuna2-13b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/WizardVicuna2-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T20:35:02.988920(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of heegyu/WizardVicuna2-13b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/WizardVicuna2-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T20:35:02.988920(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of heegyu/WizardVicuna2-13b-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/WizardVicuna2-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T20:35:02.988920(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2aa0db11310d92062576b8063dc648bee13ef72b
# Dataset Card for "mmlu-high_school_mathematics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_mathematics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:03:25+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6798, "num_examples": 5}, {"name": "test", "num_bytes": 654905, "num_examples": 270}], "download_size": 15082, "dataset_size": 661703}}
2023-08-21T06:36:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_mathematics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_mathematics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_mathematics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_mathematics-neg-prepend-fix\"\n\nMore Information needed" ]
7840d5c6449fa8fe6bb0f94dbbb87c96f8c101a8
# Dataset Card for Evaluation run of Writer/palmyra-base

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Writer/palmyra-base
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Writer/palmyra-base](https://huggingface.co/Writer/palmyra-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__palmyra-base",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-17T01:27:06.940630](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-base/blob/main/results_2023-10-17T01-27-06.940630.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0010486577181208054,
        "em_stderr": 0.0003314581465219252,
        "f1": 0.04800964765100684,
        "f1_stderr": 0.0011968648184797989,
        "acc": 0.29537785734929894,
        "acc_stderr": 0.00829420088462589
    },
    "harness|drop|3": {
        "em": 0.0010486577181208054,
        "em_stderr": 0.0003314581465219252,
        "f1": 0.04800964765100684,
        "f1_stderr": 0.0011968648184797989
    },
    "harness|gsm8k|5": {
        "acc": 0.009855951478392721,
        "acc_stderr": 0.0027210765770416625
    },
    "harness|winogrande|5": {
        "acc": 0.5808997632202052,
        "acc_stderr": 0.013867325192210116
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Writer__palmyra-base
[ "region:us" ]
2023-08-18T11:03:30+00:00
{"pretty_name": "Evaluation run of Writer/palmyra-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [Writer/palmyra-base](https://huggingface.co/Writer/palmyra-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__palmyra-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T01:27:06.940630](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-base/blob/main/results_2023-10-17T01-27-06.940630.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219252,\n \"f1\": 0.04800964765100684,\n \"f1_stderr\": 0.0011968648184797989,\n \"acc\": 0.29537785734929894,\n \"acc_stderr\": 0.00829420088462589\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219252,\n \"f1\": 0.04800964765100684,\n \"f1_stderr\": 0.0011968648184797989\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \"acc_stderr\": 0.0027210765770416625\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5808997632202052,\n \"acc_stderr\": 0.013867325192210116\n }\n}\n```", "repo_url": "https://huggingface.co/Writer/palmyra-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|arc:challenge|25_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T01_27_06.940630", "path": ["**/details_harness|drop|3_2023-10-17T01-27-06.940630.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T01-27-06.940630.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T01_27_06.940630", "path": ["**/details_harness|gsm8k|5_2023-10-17T01-27-06.940630.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T01-27-06.940630.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hellaswag|10_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:49:48.066230.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:49:48.066230.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T12:49:48.066230.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T12:49:48.066230.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T12:49:48.066230.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T01_27_06.940630", "path": ["**/details_harness|winogrande|5_2023-10-17T01-27-06.940630.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T01-27-06.940630.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:42:00.075340.parquet", 
"**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:42:00.075340.parquet", 
"**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:42:00.075340.parquet", 
"**/details_original|mmlu:management|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:42:00.075340.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": 
["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": 
["**/details_original|mmlu:global_facts|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_statistics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:management|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", 
"data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T20_42_00.075340", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:42:00.075340.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:42:00.075340.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T12_49_48.066230", "path": ["results_2023-07-19T12:49:48.066230.parquet"]}, {"split": "2023_08_28T20_42_00.075340", "path": ["results_2023-08-28T20:42:00.075340.parquet"]}, {"split": "2023_10_17T01_27_06.940630", "path": ["results_2023-10-17T01-27-06.940630.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T01-27-06.940630.parquet"]}]}]}
2023-10-17T00:27:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Writer/palmyra-base ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Writer/palmyra-base on the Open LLM Leaderboard. The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T01:27:06.940630 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
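The summary above references a loading snippet ("you can for instance do the following:") that was dropped when the card text was flattened; a minimal sketch is shown below. The repo id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming, and the config/split names are taken from the metadata listing above.

```python
from datasets import load_dataset

# Assumed repo id (details_<org>__<model> convention); not stated explicitly in the flattened text.
data = load_dataset(
    "open-llm-leaderboard/details_Writer__palmyra-base",
    "harness_winogrande_5",   # any config_name from the metadata above works here
    split="latest",           # or a timestamped split, e.g. "2023_10_17T01_27_06.940630"
)
print(data)
```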
[ "# Dataset Card for Evaluation run of Writer/palmyra-base", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Writer/palmyra-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T01:27:06.940630(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Writer/palmyra-base", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Writer/palmyra-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T01:27:06.940630(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Writer/palmyra-base## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Writer/palmyra-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T01:27:06.940630(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1ba71bffebeedac0b7cfee1e770992fbb7be8af4
# Dataset Card for Evaluation run of Writer/camel-5b-hf

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Writer/camel-5b-hf
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Writer/camel-5b-hf](https://huggingface.co/Writer/camel-5b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__camel-5b-hf",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T14:36:32.116490](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__camel-5b-hf/blob/main/results_2023-10-18T14-36-32.116490.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.08294882550335571,
        "em_stderr": 0.0028244998601496944,
        "f1": 0.14997168624161072,
        "f1_stderr": 0.003145718068946184,
        "acc": 0.3069466775731776,
        "acc_stderr": 0.007700124028579334
    },
    "harness|drop|3": {
        "em": 0.08294882550335571,
        "em_stderr": 0.0028244998601496944,
        "f1": 0.14997168624161072,
        "f1_stderr": 0.003145718068946184
    },
    "harness|gsm8k|5": {
        "acc": 0.0037907505686125853,
        "acc_stderr": 0.0016927007401502051
    },
    "harness|winogrande|5": {
        "acc": 0.6101026045777427,
        "acc_stderr": 0.013707547317008463
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Writer__camel-5b-hf
[ "region:us" ]
2023-08-18T11:03:39+00:00
{"pretty_name": "Evaluation run of Writer/camel-5b-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [Writer/camel-5b-hf](https://huggingface.co/Writer/camel-5b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__camel-5b-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T14:36:32.116490](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__camel-5b-hf/blob/main/results_2023-10-18T14-36-32.116490.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08294882550335571,\n \"em_stderr\": 0.0028244998601496944,\n \"f1\": 0.14997168624161072,\n \"f1_stderr\": 0.003145718068946184,\n \"acc\": 0.3069466775731776,\n \"acc_stderr\": 0.007700124028579334\n },\n \"harness|drop|3\": {\n \"em\": 0.08294882550335571,\n \"em_stderr\": 0.0028244998601496944,\n \"f1\": 0.14997168624161072,\n \"f1_stderr\": 0.003145718068946184\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401502051\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6101026045777427,\n \"acc_stderr\": 0.013707547317008463\n }\n}\n```", "repo_url": "https://huggingface.co/Writer/camel-5b-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T14_36_32.116490", "path": ["**/details_harness|drop|3_2023-10-18T14-36-32.116490.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T14-36-32.116490.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T14_36_32.116490", "path": ["**/details_harness|gsm8k|5_2023-10-18T14-36-32.116490.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T14-36-32.116490.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:25:02.904083.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:25:02.904083.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:25:02.904083.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:25:02.904083.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:25:02.904083.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:25:02.904083.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T14_36_32.116490", "path": ["**/details_harness|winogrande|5_2023-10-18T14-36-32.116490.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T14-36-32.116490.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_25_02.904083", "path": ["results_2023-07-19T15:25:02.904083.parquet"]}, {"split": "2023_10_18T14_36_32.116490", "path": ["results_2023-10-18T14-36-32.116490.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T14-36-32.116490.parquet"]}]}]}
2023-10-18T13:36:45+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Writer/camel-5b-hf

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model Writer/camel-5b-hf on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-18T14:36:32.116490 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
[ "# Dataset Card for Evaluation run of Writer/camel-5b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Writer/camel-5b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T14:36:32.116490(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Writer/camel-5b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Writer/camel-5b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T14:36:32.116490(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Writer/camel-5b-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Writer/camel-5b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T14:36:32.116490(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1e30e3fb4eaf0dd7c1cd9bcc54695e65cf833cc4
# Dataset Card for "mmlu-high_school_microeconomics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_microeconomics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:03:40+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5450, "num_examples": 5}, {"name": "test", "num_bytes": 608302, "num_examples": 238}], "download_size": 12377, "dataset_size": 613752}}
2023-08-21T06:36:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_microeconomics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_microeconomics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_microeconomics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_microeconomics-neg-prepend-fix\"\n\nMore Information needed" ]
c0cb02052b7d257b5e0f7da6c27e34e7b67ddd26
# Dataset Card for Evaluation run of ariellee/SuperPlatty-30B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/ariellee/SuperPlatty-30B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [ariellee/SuperPlatty-30B](https://huggingface.co/ariellee/SuperPlatty-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ariellee__SuperPlatty-30B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-18T02:07:35.126517](https://huggingface.co/datasets/open-llm-leaderboard/details_ariellee__SuperPlatty-30B/blob/main/results_2023-09-18T02-07-35.126517.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.4521812080536913,
        "em_stderr": 0.0050969968963073785,
        "f1": 0.4944075083892625,
        "f1_stderr": 0.004926147134745944,
        "acc": 0.44987891738317937,
        "acc_stderr": 0.009646692360892724
    },
    "harness|drop|3": {
        "em": 0.4521812080536913,
        "em_stderr": 0.0050969968963073785,
        "f1": 0.4944075083892625,
        "f1_stderr": 0.004926147134745944
    },
    "harness|gsm8k|5": {
        "acc": 0.09628506444275967,
        "acc_stderr": 0.008125264128215882
    },
    "harness|winogrande|5": {
        "acc": 0.8034727703235991,
        "acc_stderr": 0.011168120593569567
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_ariellee__SuperPlatty-30B
[ "region:us" ]
2023-08-18T11:03:48+00:00
{"pretty_name": "Evaluation run of ariellee/SuperPlatty-30B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ariellee/SuperPlatty-30B](https://huggingface.co/ariellee/SuperPlatty-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ariellee__SuperPlatty-30B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T02:07:35.126517](https://huggingface.co/datasets/open-llm-leaderboard/details_ariellee__SuperPlatty-30B/blob/main/results_2023-09-18T02-07-35.126517.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4521812080536913,\n \"em_stderr\": 0.0050969968963073785,\n \"f1\": 0.4944075083892625,\n \"f1_stderr\": 0.004926147134745944,\n \"acc\": 0.44987891738317937,\n \"acc_stderr\": 0.009646692360892724\n },\n \"harness|drop|3\": {\n \"em\": 0.4521812080536913,\n \"em_stderr\": 0.0050969968963073785,\n \"f1\": 0.4944075083892625,\n \"f1_stderr\": 0.004926147134745944\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09628506444275967,\n \"acc_stderr\": 0.008125264128215882\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569567\n }\n}\n```", "repo_url": "https://huggingface.co/ariellee/SuperPlatty-30B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T02_07_35.126517", "path": ["**/details_harness|drop|3_2023-09-18T02-07-35.126517.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T02-07-35.126517.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T02_07_35.126517", "path": ["**/details_harness|gsm8k|5_2023-09-18T02-07-35.126517.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T02-07-35.126517.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:30:48.838844.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:30:48.838844.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:30:48.838844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:30:48.838844.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:30:48.838844.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T02_07_35.126517", "path": ["**/details_harness|winogrande|5_2023-09-18T02-07-35.126517.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T02-07-35.126517.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_30_48.838844", "path": ["results_2023-07-19T22:30:48.838844.parquet"]}, {"split": "2023_09_18T02_07_35.126517", "path": ["results_2023-09-18T02-07-35.126517.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T02-07-35.126517.parquet"]}]}]}
2023-09-18T01:07:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ariellee/SuperPlatty-30B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model ariellee/SuperPlatty-30B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-18T02:07:35.126517 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
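The text above says "To load the details from a run, you can for instance do the following:" but the snippet itself was stripped when the markdown was flattened. Per this record's own metadata, it would look roughly like this:

```python
from datasets import load_dataset

# Mirrors the loading snippet embedded in this record's metadata block above.
data = load_dataset(
    "open-llm-leaderboard/details_ariellee__SuperPlatty-30B",
    "harness_winogrande_5",
    split="train",
)
```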
[ "# Dataset Card for Evaluation run of ariellee/SuperPlatty-30B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ariellee/SuperPlatty-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T02:07:35.126517(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ariellee/SuperPlatty-30B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ariellee/SuperPlatty-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T02:07:35.126517(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ariellee/SuperPlatty-30B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ariellee/SuperPlatty-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T02:07:35.126517(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6d9be16cc06762d3f9cc53b3352e5828057c040f
# Dataset Card for "mmlu-high_school_physics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_physics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:03:53+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6996, "num_examples": 5}, {"name": "test", "num_bytes": 466852, "num_examples": 151}], "download_size": 14956, "dataset_size": 473848}}
2023-08-21T06:36:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_physics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_physics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_physics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_physics-neg-prepend-fix\"\n\nMore Information needed" ]
acfcd15ca333184726fa0d1e7b36476abe337d95
# Dataset Card for Evaluation run of roneneldan/TinyStories-33M

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/roneneldan/TinyStories-33M
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-33M](https://huggingface.co/roneneldan/TinyStories-33M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-33M",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T05:35:11.802678](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-33M/blob/main/results_2023-09-23T05-35-11.802678.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0003145973154362416,
        "em_stderr": 0.0001816137946884096,
        "f1": 0.001937919463087248,
        "f1_stderr": 0.0003031702602652814,
        "acc": 0.24546172059984214,
        "acc_stderr": 0.007025085047248846
    },
    "harness|drop|3": {
        "em": 0.0003145973154362416,
        "em_stderr": 0.0001816137946884096,
        "f1": 0.001937919463087248,
        "f1_stderr": 0.0003031702602652814
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.4909234411996843,
        "acc_stderr": 0.014050170094497692
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_roneneldan__TinyStories-33M
[ "region:us" ]
2023-08-18T11:03:56+00:00
{"pretty_name": "Evaluation run of roneneldan/TinyStories-33M", "dataset_summary": "Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-33M](https://huggingface.co/roneneldan/TinyStories-33M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-33M\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T05:35:11.802678](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-33M/blob/main/results_2023-09-23T05-35-11.802678.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0003145973154362416,\n \"em_stderr\": 0.0001816137946884096,\n \"f1\": 0.001937919463087248,\n \"f1_stderr\": 0.0003031702602652814,\n \"acc\": 0.24546172059984214,\n \"acc_stderr\": 0.007025085047248846\n },\n \"harness|drop|3\": {\n \"em\": 0.0003145973154362416,\n \"em_stderr\": 0.0001816137946884096,\n \"f1\": 0.001937919463087248,\n \"f1_stderr\": 0.0003031702602652814\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4909234411996843,\n \"acc_stderr\": 0.014050170094497692\n }\n}\n```", "repo_url": "https://huggingface.co/roneneldan/TinyStories-33M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T05_35_11.802678", "path": ["**/details_harness|drop|3_2023-09-23T05-35-11.802678.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T05-35-11.802678.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T05_35_11.802678", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-35-11.802678.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-35-11.802678.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:19.766363.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:19.766363.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T05_35_11.802678", "path": ["**/details_harness|winogrande|5_2023-09-23T05-35-11.802678.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T05-35-11.802678.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_32_19.766363", "path": ["results_2023-07-19T13:32:19.766363.parquet"]}, {"split": "2023_09_23T05_35_11.802678", "path": ["results_2023-09-23T05-35-11.802678.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T05-35-11.802678.parquet"]}]}]}
2023-09-23T04:35:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of roneneldan/TinyStories-33M ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model roneneldan/TinyStories-33M on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T05:35:11.802678 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
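The loader snippet that "you can for instance do the following" refers to was stripped when this card text was flattened. A minimal sketch, assuming the details repository for this entry follows the leaderboard's usual `details_<org>__<model>` naming (the exact repo id is not spelled out in this flattened text; the configuration name comes from this entry's config list):

```python
from datasets import load_dataset

# Assumed repository id, following the details_<org>__<model> convention;
# any of this entry's evaluated-task configurations can be passed in place
# of "harness_winogrande_5".
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-33M",
                    "harness_winogrande_5",
                    split="train")
```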
[ "# Dataset Card for Evaluation run of roneneldan/TinyStories-33M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-33M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:35:11.802678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of roneneldan/TinyStories-33M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-33M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:35:11.802678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of roneneldan/TinyStories-33M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-33M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T05:35:11.802678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8a65a829a11e8b5ffaae950f1b369ea8ff53ab06
# Dataset Card for Evaluation run of roneneldan/TinyStories-3M

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/roneneldan/TinyStories-3M
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-3M](https://huggingface.co/roneneldan/TinyStories-3M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-3M",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-16T01:04:49.334028](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-3M/blob/main/results_2023-10-16T01-04-49.334028.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 0.0009553271812080536,
        "f1_stderr": 0.00014278665313780474,
        "acc": 0.2462509865824783,
        "acc_stderr": 0.00702545276061429
    },
    "harness|drop|3": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 0.0009553271812080536,
        "f1_stderr": 0.00014278665313780474
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.4925019731649566,
        "acc_stderr": 0.01405090552122858
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_roneneldan__TinyStories-3M
[ "region:us" ]
2023-08-18T11:04:05+00:00
{"pretty_name": "Evaluation run of roneneldan/TinyStories-3M", "dataset_summary": "Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-3M](https://huggingface.co/roneneldan/TinyStories-3M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-3M\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T01:04:49.334028](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-3M/blob/main/results_2023-10-16T01-04-49.334028.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 0.0009553271812080536,\n \"f1_stderr\": 0.00014278665313780474,\n \"acc\": 0.2462509865824783,\n \"acc_stderr\": 0.00702545276061429\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 0.0009553271812080536,\n \"f1_stderr\": 0.00014278665313780474\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4925019731649566,\n \"acc_stderr\": 0.01405090552122858\n }\n}\n```", "repo_url": "https://huggingface.co/roneneldan/TinyStories-3M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T01_04_49.334028", "path": ["**/details_harness|drop|3_2023-10-16T01-04-49.334028.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T01-04-49.334028.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T01_04_49.334028", "path": ["**/details_harness|gsm8k|5_2023-10-16T01-04-49.334028.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T01-04-49.334028.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:26:26.672547.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:26:26.672547.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:26:26.672547.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:26:26.672547.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:26:26.672547.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:26:26.672547.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T01_04_49.334028", "path": ["**/details_harness|winogrande|5_2023-10-16T01-04-49.334028.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T01-04-49.334028.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_26_26.672547", "path": ["results_2023-07-19T13:26:26.672547.parquet"]}, {"split": "2023_10_16T01_04_49.334028", "path": ["results_2023-10-16T01-04-49.334028.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T01-04-49.334028.parquet"]}]}]}
2023-10-16T00:05:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of roneneldan/TinyStories-3M ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model roneneldan/TinyStories-3M on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T01:04:49.334028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
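The load referred to at the end of the summary can be sketched as follows, assuming the dataset repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming and using the `harness_winogrande_5` config and `latest` split listed in this record's metadata:

```python
from datasets import load_dataset

# Sketch only: the repository id is assumed from the standard
# open-llm-leaderboard/details_<org>__<model> naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_roneneldan__TinyStories-3M",
    "harness_winogrande_5",  # one per-task config among the 64 described above
    split="latest",          # timestamped splits such as 2023_10_16T01_04_49.334028 also exist
)
print(data)
```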
[ "# Dataset Card for Evaluation run of roneneldan/TinyStories-3M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-3M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T01:04:49.334028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of roneneldan/TinyStories-3M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-3M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T01:04:49.334028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of roneneldan/TinyStories-3M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-3M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T01:04:49.334028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0ff6c4d662e4613d5bcee605eb5f2d6c1e2252d7
# Dataset Card for "mmlu-high_school_psychology-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_psychology-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:04:07+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 7683, "num_examples": 5}, {"name": "test", "num_bytes": 1743191, "num_examples": 545}], "download_size": 18091, "dataset_size": 1750874}}
2023-08-21T06:36:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_psychology-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_psychology-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_psychology-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_psychology-neg-prepend-fix\"\n\nMore Information needed" ]
213ada577e4fdbdc5bd52e8910d13f78719cbaf4
# Dataset Card for Evaluation run of roneneldan/TinyStories-8M

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/roneneldan/TinyStories-8M
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-8M](https://huggingface.co/roneneldan/TinyStories-8M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-8M",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T21:54:42.059067](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-8M/blob/main/results_2023-09-22T21-54-42.059067.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.00010486577181208053,
        "em_stderr": 0.0001048657718120815,
        "f1": 0.0030820050335570444,
        "f1_stderr": 0.00023396683820156773,
        "acc": 0.2513812154696133,
        "acc_stderr": 0.007026135605808221
    },
    "harness|drop|3": {
        "em": 0.00010486577181208053,
        "em_stderr": 0.0001048657718120815,
        "f1": 0.0030820050335570444,
        "f1_stderr": 0.00023396683820156773
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5027624309392266,
        "acc_stderr": 0.014052271211616441
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
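Beyond the per-example details, the aggregated numbers shown under "Latest results" live in the "results" configuration; a short sketch of pulling them (config and split names as listed in this record's metadata):

```python
from datasets import load_dataset

# "latest" points at the most recent run; timestamped splits hold earlier runs.
results = load_dataset(
    "open-llm-leaderboard/details_roneneldan__TinyStories-8M",
    "results",
    split="latest",
)
print(results[0])
```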
open-llm-leaderboard/details_roneneldan__TinyStories-8M
[ "region:us" ]
2023-08-18T11:04:14+00:00
{"pretty_name": "Evaluation run of roneneldan/TinyStories-8M", "dataset_summary": "Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-8M](https://huggingface.co/roneneldan/TinyStories-8M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-8M\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T21:54:42.059067](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-8M/blob/main/results_2023-09-22T21-54-42.059067.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00010486577181208053,\n \"em_stderr\": 0.0001048657718120815,\n \"f1\": 0.0030820050335570444,\n \"f1_stderr\": 0.00023396683820156773,\n \"acc\": 0.2513812154696133,\n \"acc_stderr\": 0.007026135605808221\n },\n \"harness|drop|3\": {\n \"em\": 0.00010486577181208053,\n \"em_stderr\": 0.0001048657718120815,\n \"f1\": 0.0030820050335570444,\n \"f1_stderr\": 0.00023396683820156773\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5027624309392266,\n \"acc_stderr\": 0.014052271211616441\n }\n}\n```", "repo_url": "https://huggingface.co/roneneldan/TinyStories-8M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T21_54_42.059067", "path": ["**/details_harness|drop|3_2023-09-22T21-54-42.059067.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T21-54-42.059067.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T21_54_42.059067", "path": ["**/details_harness|gsm8k|5_2023-09-22T21-54-42.059067.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T21-54-42.059067.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:29:12.033365.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:29:12.033365.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:29:12.033365.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:29:12.033365.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:29:12.033365.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T21_54_42.059067", "path": ["**/details_harness|winogrande|5_2023-09-22T21-54-42.059067.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T21-54-42.059067.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_29_12.033365", "path": ["results_2023-07-19T13:29:12.033365.parquet"]}, {"split": "2023_09_22T21_54_42.059067", "path": ["results_2023-09-22T21-54-42.059067.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T21-54-42.059067.parquet"]}]}]}
2023-09-22T20:54:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of roneneldan/TinyStories-8M ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model roneneldan/TinyStories-8M on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T21:54:42.059067(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of roneneldan/TinyStories-8M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-8M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T21:54:42.059067(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of roneneldan/TinyStories-8M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-8M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T21:54:42.059067(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of roneneldan/TinyStories-8M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-8M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T21:54:42.059067(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
44d27643b18c9fb49a77163cf8f3826cf7920874
# Dataset Card for "mmlu-high_school_statistics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
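As a minimal usage sketch only, assuming the standard `datasets` API and the `dev`/`test` splits and feature names declared in this record's metadata, the dataset could be loaded like this:

```python
# Minimal sketch, assuming the standard `datasets` library and the splits/features
# declared in this record's metadata (dev and test; question, choices, answer,
# negate_openai_prompt, neg_question, fewshot_context, ori_prompt).
from datasets import load_dataset

data = load_dataset("joey234/mmlu-high_school_statistics-neg-prepend-fix", split="test")

print(data.features)            # inspect the declared schema
print(data[0]["neg_question"])  # negated variant of the first question
```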
joey234/mmlu-high_school_statistics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:04:22+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 9060, "num_examples": 5}, {"name": "test", "num_bytes": 779208, "num_examples": 216}], "download_size": 18867, "dataset_size": 788268}}
2023-08-21T06:37:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_statistics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_statistics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_statistics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_statistics-neg-prepend-fix\"\n\nMore Information needed" ]
2c86b34fa95d35f780757e117347fcdf3f8e9afe
# Dataset Card for Evaluation run of roneneldan/TinyStories-1M

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/roneneldan/TinyStories-1M
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-1M](https://huggingface.co/roneneldan/TinyStories-1M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-1M",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T21:41:24.294253](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-1M/blob/main/results_2023-09-22T21-41-24.294253.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.00020973154362416107,
        "em_stderr": 0.00014829481977282063,
        "f1": 0.003178481543624158,
        "f1_stderr": 0.0002730192207643319,
        "acc": 0.26085240726124703,
        "acc_stderr": 0.007019619608242314
    },
    "harness|drop|3": {
        "em": 0.00020973154362416107,
        "em_stderr": 0.00014829481977282063,
        "f1": 0.003178481543624158,
        "f1_stderr": 0.0002730192207643319
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5217048145224941,
        "acc_stderr": 0.014039239216484627
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
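The aggregated numbers described above live in the "results" configuration; a minimal sketch, assuming the "results" config and "latest" split declared in this dataset's config metadata, is:

```python
# Minimal sketch, assuming the "results" configuration and "latest" split
# listed in this dataset's config metadata.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_roneneldan__TinyStories-1M",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the latest run
```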
open-llm-leaderboard/details_roneneldan__TinyStories-1M
[ "region:us" ]
2023-08-18T11:04:23+00:00
{"pretty_name": "Evaluation run of roneneldan/TinyStories-1M", "dataset_summary": "Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-1M](https://huggingface.co/roneneldan/TinyStories-1M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-1M\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T21:41:24.294253](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-1M/blob/main/results_2023-09-22T21-41-24.294253.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00020973154362416107,\n \"em_stderr\": 0.00014829481977282063,\n \"f1\": 0.003178481543624158,\n \"f1_stderr\": 0.0002730192207643319,\n \"acc\": 0.26085240726124703,\n \"acc_stderr\": 0.007019619608242314\n },\n \"harness|drop|3\": {\n \"em\": 0.00020973154362416107,\n \"em_stderr\": 0.00014829481977282063,\n \"f1\": 0.003178481543624158,\n \"f1_stderr\": 0.0002730192207643319\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5217048145224941,\n \"acc_stderr\": 0.014039239216484627\n }\n}\n```", "repo_url": "https://huggingface.co/roneneldan/TinyStories-1M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T21_41_24.294253", "path": ["**/details_harness|drop|3_2023-09-22T21-41-24.294253.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T21-41-24.294253.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T21_41_24.294253", "path": ["**/details_harness|gsm8k|5_2023-09-22T21-41-24.294253.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T21-41-24.294253.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:25:02.593147.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:25:02.593147.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T21_41_24.294253", "path": ["**/details_harness|winogrande|5_2023-09-22T21-41-24.294253.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T21-41-24.294253.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_25_02.593147", "path": ["results_2023-07-19T13:25:02.593147.parquet"]}, {"split": "2023_09_22T21_41_24.294253", "path": ["results_2023-09-22T21-41-24.294253.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T21-41-24.294253.parquet"]}]}]}
2023-09-22T20:41:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of roneneldan/TinyStories-1M ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model roneneldan/TinyStories-1M on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T21:41:24.294253 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
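The loading snippet referred to just above is the one given in the full card for this run:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-1M",
	"harness_winogrande_5",
	split="train")
```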
[ "# Dataset Card for Evaluation run of roneneldan/TinyStories-1M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-1M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T21:41:24.294253(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of roneneldan/TinyStories-1M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-1M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T21:41:24.294253(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of roneneldan/TinyStories-1M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-1M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T21:41:24.294253(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
189905d644b7a5a78063ce44d05d20df3a181e82
# Dataset Card for Evaluation run of roneneldan/TinyStories-28M

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/roneneldan/TinyStories-28M
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-28M](https://huggingface.co/roneneldan/TinyStories-28M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-28M",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T11:40:27.697505](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-28M/blob/main/results_2023-10-23T11-40-27.697505.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 0.000667994966442953,
        "f1_stderr": 0.0001364795889684006,
        "acc": 0.2521704814522494,
        "acc_stderr": 0.007025978032038445
    },
    "harness|drop|3": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 0.000667994966442953,
        "f1_stderr": 0.0001364795889684006
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5043409629044988,
        "acc_stderr": 0.01405195606407689
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_roneneldan__TinyStories-28M
[ "region:us" ]
2023-08-18T11:04:32+00:00
{"pretty_name": "Evaluation run of roneneldan/TinyStories-28M", "dataset_summary": "Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-28M](https://huggingface.co/roneneldan/TinyStories-28M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-28M\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T11:40:27.697505](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-28M/blob/main/results_2023-10-23T11-40-27.697505.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 0.000667994966442953,\n \"f1_stderr\": 0.0001364795889684006,\n \"acc\": 0.2521704814522494,\n \"acc_stderr\": 0.007025978032038445\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 0.000667994966442953,\n \"f1_stderr\": 0.0001364795889684006\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.01405195606407689\n }\n}\n```", "repo_url": "https://huggingface.co/roneneldan/TinyStories-28M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T11_40_27.697505", "path": ["**/details_harness|drop|3_2023-10-23T11-40-27.697505.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T11-40-27.697505.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T11_40_27.697505", "path": ["**/details_harness|gsm8k|5_2023-10-23T11-40-27.697505.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T11-40-27.697505.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:08.084027.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:08.084027.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:08.084027.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:08.084027.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:08.084027.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:08.084027.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T11_40_27.697505", "path": ["**/details_harness|winogrande|5_2023-10-23T11-40-27.697505.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T11-40-27.697505.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_32_08.084027", "path": ["results_2023-07-19T13:32:08.084027.parquet"]}, {"split": "2023_10_23T11_40_27.697505", "path": ["results_2023-10-23T11-40-27.697505.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T11-40-27.697505.parquet"]}]}]}
2023-10-23T10:40:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of roneneldan/TinyStories-28M ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model roneneldan/TinyStories-28M on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T11:40:27.697505 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of roneneldan/TinyStories-28M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-28M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T11:40:27.697505(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of roneneldan/TinyStories-28M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-28M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T11:40:27.697505(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of roneneldan/TinyStories-28M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model roneneldan/TinyStories-28M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T11:40:27.697505(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
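The summary in the record above describes loading per-run detail splits with the `datasets` library. A minimal sketch of that pattern, assuming `datasets` is installed; the repo id, config names, and the "results"/"latest" names are taken from this record's card text and metadata, not verified against the hosted files:

```python
from datasets import load_dataset

# Per-task details: config names follow the "harness_<task>_<n-shot>" pattern
# listed in the metadata above; per the card, "train" points to the latest run.
winogrande_details = load_dataset(
    "open-llm-leaderboard/details_roneneldan__TinyStories-28M",
    "harness_winogrande_5",
    split="train",
)

# Aggregated metrics for every run live in the "results" config;
# the "latest" split holds the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_roneneldan__TinyStories-28M",
    "results",
    split="latest",
)
print(results[0])
```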
1d6c3cd63849bfc51f026a7204e9364c0ca14ca0
# Dataset Card for "mmlu-high_school_us_history-neg-prepend-fix"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_us_history-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:04:36+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 28257, "num_examples": 5}, {"name": "test", "num_bytes": 1258154, "num_examples": 204}], "download_size": 55081, "dataset_size": 1286411}}
2023-08-21T06:37:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_us_history-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_us_history-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_us_history-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 30 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_us_history-neg-prepend-fix\"\n\nMore Information needed" ]
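The card for this record is only a stub, but its metadata defines a single "default" config with "dev" and "test" splits and lists the feature schema. A minimal sketch of loading and inspecting it, assuming the `datasets` library; the field names and example counts are taken from the metadata, not verified against the hosted files:

```python
from datasets import load_dataset

# "dev" (5 examples) and "test" (204 examples) splits come from the
# dataset_info block in this record's metadata.
mmlu_neg = load_dataset("joey234/mmlu-high_school_us_history-neg-prepend-fix")

example = mmlu_neg["test"][0]
print(example["question"])         # original MMLU question
print(example["neg_question"])     # negated variant of the question
print(example["choices"])          # answer options
print(example["answer"])           # gold label as a class index (A-D)
print(example["fewshot_context"])  # prepended few-shot prompt
```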
79955230b3a45a3f31f035082d0401432fe3b2d6
# Dataset Card for Evaluation run of openbmb/UltraLM-65b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openbmb/UltraLM-65b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openbmb/UltraLM-65b](https://huggingface.co/openbmb/UltraLM-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openbmb__UltraLM-65b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T05:14:21.286059](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-65b/blob/main/results_2023-09-23T05-14-21.286059.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.23804530201342283,
        "em_stderr": 0.004361481495925771,
        "f1": 0.2999853187919465,
        "f1_stderr": 0.004304795126990332,
        "acc": 0.5694431396390439,
        "acc_stderr": 0.011961137264223144
    },
    "harness|drop|3": {
        "em": 0.23804530201342283,
        "em_stderr": 0.004361481495925771,
        "f1": 0.2999853187919465,
        "f1_stderr": 0.004304795126990332
    },
    "harness|gsm8k|5": {
        "acc": 0.32752084912812734,
        "acc_stderr": 0.012927102210426474
    },
    "harness|winogrande|5": {
        "acc": 0.8113654301499605,
        "acc_stderr": 0.010995172318019811
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_openbmb__UltraLM-65b
[ "region:us" ]
2023-08-18T11:04:40+00:00
{"pretty_name": "Evaluation run of openbmb/UltraLM-65b", "dataset_summary": "Dataset automatically created during the evaluation run of model [openbmb/UltraLM-65b](https://huggingface.co/openbmb/UltraLM-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openbmb__UltraLM-65b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T05:14:21.286059](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-65b/blob/main/results_2023-09-23T05-14-21.286059.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23804530201342283,\n \"em_stderr\": 0.004361481495925771,\n \"f1\": 0.2999853187919465,\n \"f1_stderr\": 0.004304795126990332,\n \"acc\": 0.5694431396390439,\n \"acc_stderr\": 0.011961137264223144\n },\n \"harness|drop|3\": {\n \"em\": 0.23804530201342283,\n \"em_stderr\": 0.004361481495925771,\n \"f1\": 0.2999853187919465,\n \"f1_stderr\": 0.004304795126990332\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32752084912812734,\n \"acc_stderr\": 0.012927102210426474\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019811\n }\n}\n```", "repo_url": "https://huggingface.co/openbmb/UltraLM-65b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|arc:challenge|25_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T23_27_44.207127", "path": ["**/details_harness|drop|3_2023-09-18T23-27-44.207127.parquet"]}, {"split": "2023_09_23T05_14_21.286059", "path": ["**/details_harness|drop|3_2023-09-23T05-14-21.286059.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T05-14-21.286059.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T23_27_44.207127", "path": ["**/details_harness|gsm8k|5_2023-09-18T23-27-44.207127.parquet"]}, {"split": "2023_09_23T05_14_21.286059", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-14-21.286059.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-14-21.286059.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": 
["**/details_harness|hellaswag|10_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-04T22:09:07.792369.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T22:09:07.792369.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T22:09:07.792369.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-04T22:09:07.792369.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T22:09:07.792369.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-04T22:09:07.792369.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-04T22:09:07.792369.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T23_27_44.207127", "path": ["**/details_harness|winogrande|5_2023-09-18T23-27-44.207127.parquet"]}, {"split": "2023_09_23T05_14_21.286059", "path": ["**/details_harness|winogrande|5_2023-09-23T05-14-21.286059.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T05-14-21.286059.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_04T22_09_07.792369", "path": ["results_2023-08-04T22:09:07.792369.parquet"]}, {"split": "2023_09_18T23_27_44.207127", "path": ["results_2023-09-18T23-27-44.207127.parquet"]}, {"split": "2023_09_23T05_14_21.286059", "path": ["results_2023-09-23T05-14-21.286059.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T05-14-21.286059.parquet"]}]}]}
2023-09-23T04:14:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openbmb/UltraLM-65b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openbmb/UltraLM-65b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T05:14:21.286059 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
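The flattened card above says "To load the details from a run, you can for instance do the following:" but the accompanying snippet was dropped in this processed copy. A minimal sketch is shown below; the repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming, while the `harness_winogrande_5` configuration and the `latest` split both appear in this record's metadata.

```python
# Minimal sketch (assumed repo id): load one configuration of the evaluation
# details for openbmb/UltraLM-65b from the Open LLM Leaderboard details repo.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_openbmb__UltraLM-65b",  # assumed from the details_<org>__<model> convention
    "harness_winogrande_5",   # configuration listed in this record's metadata
    split="latest",           # "latest" always points at the most recent run
)
print(data)
```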
[ "# Dataset Card for Evaluation run of openbmb/UltraLM-65b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openbmb/UltraLM-65b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:14:21.286059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openbmb/UltraLM-65b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openbmb/UltraLM-65b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:14:21.286059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openbmb/UltraLM-65b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openbmb/UltraLM-65b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T05:14:21.286059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
80ec9ab847d6772c0adcb674a52ffbd40b28e27f
# Dataset Card for Evaluation run of xhyi/PT_GPTNEO350_ATG

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/xhyi/PT_GPTNEO350_ATG
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [xhyi/PT_GPTNEO350_ATG](https://huggingface.co/xhyi/PT_GPTNEO350_ATG) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-16T19:50:14.065023](https://huggingface.co/datasets/open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG/blob/main/results_2023-09-16T19-50-14.065023.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.0002344378046483565,
        "f1": 0.036350671140939664,
        "f1_stderr": 0.001029772885671985,
        "acc": 0.259575160680552,
        "acc_stderr": 0.007950023713639726
    },
    "harness|drop|3": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.0002344378046483565,
        "f1": 0.036350671140939664,
        "f1_stderr": 0.001029772885671985
    },
    "harness|gsm8k|5": {
        "acc": 0.004548900682335102,
        "acc_stderr": 0.0018535550440036202
    },
    "harness|winogrande|5": {
        "acc": 0.5146014206787688,
        "acc_stderr": 0.014046492383275832
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG
[ "region:us" ]
2023-08-18T11:04:49+00:00
{"pretty_name": "Evaluation run of xhyi/PT_GPTNEO350_ATG", "dataset_summary": "Dataset automatically created during the evaluation run of model [xhyi/PT_GPTNEO350_ATG](https://huggingface.co/xhyi/PT_GPTNEO350_ATG) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T19:50:14.065023](https://huggingface.co/datasets/open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG/blob/main/results_2023-09-16T19-50-14.065023.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.0002344378046483565,\n \"f1\": 0.036350671140939664,\n \"f1_stderr\": 0.001029772885671985,\n \"acc\": 0.259575160680552,\n \"acc_stderr\": 0.007950023713639726\n },\n \"harness|drop|3\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.0002344378046483565,\n \"f1\": 0.036350671140939664,\n \"f1_stderr\": 0.001029772885671985\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \"acc_stderr\": 0.0018535550440036202\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5146014206787688,\n \"acc_stderr\": 0.014046492383275832\n }\n}\n```", "repo_url": "https://huggingface.co/xhyi/PT_GPTNEO350_ATG", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|arc:challenge|25_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T19_50_14.065023", "path": ["**/details_harness|drop|3_2023-09-16T19-50-14.065023.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T19-50-14.065023.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T19_50_14.065023", "path": ["**/details_harness|gsm8k|5_2023-09-16T19-50-14.065023.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T19-50-14.065023.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hellaswag|10_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:43:22.024559.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:43:22.024559.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T11:43:22.024559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T11:43:22.024559.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T11:43:22.024559.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T19_50_14.065023", "path": ["**/details_harness|winogrande|5_2023-09-16T19-50-14.065023.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T19-50-14.065023.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T11_43_22.024559", "path": ["results_2023-07-19T11:43:22.024559.parquet"]}, {"split": "2023_09_16T19_50_14.065023", "path": ["results_2023-09-16T19-50-14.065023.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T19-50-14.065023.parquet"]}]}]}
2023-09-16T18:50:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of xhyi/PT_GPTNEO350_ATG ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model xhyi/PT_GPTNEO350_ATG on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-16T19:50:14.065023 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
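The summary above refers to a loading snippet that is not reproduced in this flattened card text. A minimal sketch with the `datasets` library is given below; the repository id `open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG` is an assumption based on the usual naming convention for these evaluation dumps, while the configuration name `harness_winogrande_5` and the `latest` split are taken from the data files listed in this record.

```python
# Minimal sketch, assuming the repo id follows the usual
# "open-llm-leaderboard/details_<org>__<model>" naming convention.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG",  # assumed repo id
    "harness_winogrande_5",  # one of the 64 per-task configurations
    split="latest",          # "latest" always points to the newest run
)
print(data)
```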
[ "# Dataset Card for Evaluation run of xhyi/PT_GPTNEO350_ATG", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xhyi/PT_GPTNEO350_ATG on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T19:50:14.065023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of xhyi/PT_GPTNEO350_ATG", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xhyi/PT_GPTNEO350_ATG on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T19:50:14.065023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xhyi/PT_GPTNEO350_ATG## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xhyi/PT_GPTNEO350_ATG on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T19:50:14.065023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1a8c88ca4d273d3fb1f1e68135e3ecbfaf05af09
# Dataset Card for "mmlu-high_school_world_history-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_world_history-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:04:52+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 17052, "num_examples": 5}, {"name": "test", "num_bytes": 1694250, "num_examples": 237}], "download_size": 31580, "dataset_size": 1711302}}
2023-08-21T06:37:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-high_school_world_history-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-high_school_world_history-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-high_school_world_history-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 30 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-high_school_world_history-neg-prepend-fix\"\n\nMore Information needed" ]
757a718a4a12bd4abdc2e485d99f59ed91917ef4
# Dataset Card for "mmlu-human_aging-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-human_aging-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:05:06+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5130, "num_examples": 5}, {"name": "test", "num_bytes": 444054, "num_examples": 223}], "download_size": 12237, "dataset_size": 449184}}
2023-08-21T06:37:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-human_aging-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-human_aging-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-human_aging-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 25 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-human_aging-neg-prepend-fix\"\n\nMore Information needed" ]
4822b84dd29b03666aa062d4eb008caaecf5cc03
# Dataset Card for "mmlu-human_sexuality-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-human_sexuality-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:05:20+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5197, "num_examples": 5}, {"name": "test", "num_bytes": 288820, "num_examples": 131}], "download_size": 13461, "dataset_size": 294017}}
2023-08-21T06:37:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-human_sexuality-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-human_sexuality-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-human_sexuality-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-human_sexuality-neg-prepend-fix\"\n\nMore Information needed" ]
eb6a8a02b05d750e90e1d105f13a71c3db378ae0
# Dataset Card for "mmlu-international_law-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-international_law-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:05:34+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 7660, "num_examples": 5}, {"name": "test", "num_bytes": 467360, "num_examples": 121}], "download_size": 15537, "dataset_size": 475020}}
2023-08-21T06:38:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-international_law-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-international_law-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-international_law-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 25 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-international_law-neg-prepend-fix\"\n\nMore Information needed" ]
7cbfe6760927141ca74cf2da55152e22dbd65a69
# Dataset Card for "mmlu-jurisprudence-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-jurisprudence-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:05:46+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5970, "num_examples": 5}, {"name": "test", "num_bytes": 297404, "num_examples": 108}], "download_size": 12787, "dataset_size": 303374}}
2023-08-21T06:38:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-jurisprudence-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-jurisprudence-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-jurisprudence-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 27 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-jurisprudence-neg-prepend-fix\"\n\nMore Information needed" ]
b054987dfc31ce23b2fc57713fab991f7face257
# Dataset Card for "mmlu-logical_fallacies-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-logical_fallacies-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:06:00+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6380, "num_examples": 5}, {"name": "test", "num_bytes": 460595, "num_examples": 163}], "download_size": 13153, "dataset_size": 466975}}
2023-08-21T06:38:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-logical_fallacies-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-logical_fallacies-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-logical_fallacies-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 28 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-logical_fallacies-neg-prepend-fix\"\n\nMore Information needed" ]
493188fc4ff46fc5ed07540934f869f2df78ee41
# Dataset Card for "mmlu-machine_learning-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-machine_learning-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:06:14+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 8687, "num_examples": 5}, {"name": "test", "num_bytes": 385570, "num_examples": 112}], "download_size": 18663, "dataset_size": 394257}}
2023-08-21T06:38:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-machine_learning-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-machine_learning-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-machine_learning-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 25 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-machine_learning-neg-prepend-fix\"\n\nMore Information needed" ]
96c3a7e4d5946e04fc20bfbad628eca0045920cd
# Dataset Card for "mmlu-management-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-management-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:06:28+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 4679, "num_examples": 5}, {"name": "test", "num_bytes": 189715, "num_examples": 103}], "download_size": 11704, "dataset_size": 194394}}
2023-08-21T06:38:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-management-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-management-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-management-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 23 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-management-neg-prepend-fix\"\n\nMore Information needed" ]
d79d4338254f651411075533775ae6ce5c27c0b8
# Dataset Card for "mmlu-marketing-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-marketing-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:06:41+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6193, "num_examples": 5}, {"name": "test", "num_bytes": 632204, "num_examples": 234}], "download_size": 14819, "dataset_size": 638397}}
2023-08-21T06:39:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-marketing-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-marketing-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-marketing-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 23 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-marketing-neg-prepend-fix\"\n\nMore Information needed" ]
11e14de699bf71652d24108733047d556ada0878
# Dataset Card for "mmlu-medical_genetics-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-medical_genetics-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:06:56+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 4785, "num_examples": 5}, {"name": "test", "num_bytes": 208171, "num_examples": 100}], "download_size": 11706, "dataset_size": 212956}}
2023-08-21T06:39:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-medical_genetics-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-medical_genetics-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-medical_genetics-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 27 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-medical_genetics-neg-prepend-fix\"\n\nMore Information needed" ]
fd17bd7ac6074c1f9a7d22ee3610ade73929fbde
# Dataset Card for "mmlu-miscellaneous-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-miscellaneous-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:07:11+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 4153, "num_examples": 5}, {"name": "test", "num_bytes": 1302583, "num_examples": 783}], "download_size": 10773, "dataset_size": 1306736}}
2023-08-21T06:39:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-miscellaneous-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-miscellaneous-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-miscellaneous-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-miscellaneous-neg-prepend-fix\"\n\nMore Information needed" ]
95e355ddc2f5573682802d68d0ad8ca07b17d560
# Dataset Card for "mmlu-moral_disputes-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-moral_disputes-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:07:26+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6648, "num_examples": 5}, {"name": "test", "num_bytes": 1035772, "num_examples": 346}], "download_size": 12931, "dataset_size": 1042420}}
2023-08-21T06:39:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-moral_disputes-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-moral_disputes-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-moral_disputes-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 27 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-moral_disputes-neg-prepend-fix\"\n\nMore Information needed" ]
f1ea9a58ddf79ff2053ad2fb3958cf7bd08cd482
# Dataset Card for "mmlu-moral_scenarios-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-moral_scenarios-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:07:41+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 9395, "num_examples": 5}, {"name": "test", "num_bytes": 3529743, "num_examples": 895}], "download_size": 18412, "dataset_size": 3539138}}
2023-08-21T06:40:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-moral_scenarios-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-moral_scenarios-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-moral_scenarios-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-moral_scenarios-neg-prepend-fix\"\n\nMore Information needed" ]
a4d0f9990211a471968f6921e1c425bf8063d773
# Dataset Card for "mmlu-nutrition-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-nutrition-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:07:55+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 7518, "num_examples": 5}, {"name": "test", "num_bytes": 1003777, "num_examples": 306}], "download_size": 16915, "dataset_size": 1011295}}
2023-08-21T06:40:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-nutrition-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-nutrition-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-nutrition-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 24 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-nutrition-neg-prepend-fix\"\n\nMore Information needed" ]
10539a0d21a0cfae7964e5e2e7b1da6ff4f89393
# Dataset Card for "mmlu-philosophy-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-philosophy-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:08:09+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 4981, "num_examples": 5}, {"name": "test", "num_bytes": 647230, "num_examples": 311}], "download_size": 12766, "dataset_size": 652211}}
2023-08-21T06:40:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-philosophy-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-philosophy-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-philosophy-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 24 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-philosophy-neg-prepend-fix\"\n\nMore Information needed" ]
a64767916b50bef2ea7ff50fcdb77fda1be24656
# Dataset Card for "mmlu-prehistory-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-prehistory-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:08:23+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6833, "num_examples": 5}, {"name": "test", "num_bytes": 1008140, "num_examples": 324}], "download_size": 14881, "dataset_size": 1014973}}
2023-08-21T06:40:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-prehistory-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-prehistory-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-prehistory-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 25 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-prehistory-neg-prepend-fix\"\n\nMore Information needed" ]
b2dba769d33a30568e531cf74bd267e879ef0ac5
# Dataset Card for "mmlu-professional_accounting-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-professional_accounting-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:08:37+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 9137, "num_examples": 5}, {"name": "test", "num_bytes": 837793, "num_examples": 282}], "download_size": 16120, "dataset_size": 846930}}
2023-08-21T06:40:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-professional_accounting-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-professional_accounting-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-professional_accounting-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 27 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-professional_accounting-neg-prepend-fix\"\n\nMore Information needed" ]
31e6ebfbe6202c7007b214e857dd3c5b28f7c3ce
# Dataset Card for "mmlu-professional_law-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-professional_law-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:08:52+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 21027, "num_examples": 5}, {"name": "test", "num_bytes": 11054114, "num_examples": 1534}], "download_size": 49326, "dataset_size": 11075141}}
2023-08-21T06:41:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-professional_law-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-professional_law-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-professional_law-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-professional_law-neg-prepend-fix\"\n\nMore Information needed" ]
e3b315f13b14aef0b3a2ef0f7f71d17cfef9cf39
# Dataset Card for "mmlu-professional_medicine-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-professional_medicine-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:09:08+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 13678, "num_examples": 5}, {"name": "test", "num_bytes": 1083116, "num_examples": 272}], "download_size": 26475, "dataset_size": 1096794}}
2023-08-21T06:41:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-professional_medicine-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-professional_medicine-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-professional_medicine-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 27 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-professional_medicine-neg-prepend-fix\"\n\nMore Information needed" ]
ca63d71942b27ddfcb4e91798ac017ab47d75416
# Dataset Card for "mmlu-professional_psychology-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-professional_psychology-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:09:26+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 7999, "num_examples": 5}, {"name": "test", "num_bytes": 2096464, "num_examples": 612}], "download_size": 14733, "dataset_size": 2104463}}
2023-08-21T06:41:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-professional_psychology-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-professional_psychology-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-professional_psychology-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 28 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-professional_psychology-neg-prepend-fix\"\n\nMore Information needed" ]
e5a84c1bafbda363ea8c7d4cac9ed7de808bc8db
# Dataset Card for "mmlu-public_relations-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-public_relations-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:09:40+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6140, "num_examples": 5}, {"name": "test", "num_bytes": 293992, "num_examples": 110}], "download_size": 13927, "dataset_size": 300132}}
2023-08-21T06:41:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-public_relations-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-public_relations-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-public_relations-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-public_relations-neg-prepend-fix\"\n\nMore Information needed" ]
8adc8c3a84a44652d08a26b9e1df6373f10c0244
# Dataset Card for "mmlu-security_studies-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-security_studies-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:09:55+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 13696, "num_examples": 5}, {"name": "test", "num_bytes": 1861347, "num_examples": 245}], "download_size": 22717, "dataset_size": 1875043}}
2023-08-21T06:41:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-security_studies-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-security_studies-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-security_studies-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-security_studies-neg-prepend-fix\"\n\nMore Information needed" ]
7fe67871b41945494f3d267189a0fb009467d149
# Dataset Card for "mmlu-sociology-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-sociology-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:10:10+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6037, "num_examples": 5}, {"name": "test", "num_bytes": 570178, "num_examples": 201}], "download_size": 14589, "dataset_size": 576215}}
2023-08-21T06:42:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-sociology-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-sociology-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-sociology-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 24 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-sociology-neg-prepend-fix\"\n\nMore Information needed" ]
1ebd1f7a6892c339225da045de18eadb8978d0c7
# Dataset Card for "mmlu-us_foreign_policy-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-us_foreign_policy-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:10:24+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6163, "num_examples": 5}, {"name": "test", "num_bytes": 275921, "num_examples": 100}], "download_size": 13599, "dataset_size": 282084}}
2023-08-21T06:42:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-us_foreign_policy-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-us_foreign_policy-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-us_foreign_policy-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 28 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-us_foreign_policy-neg-prepend-fix\"\n\nMore Information needed" ]
8858e18444bba1661a91c8743a9cc876b6efece2
# Dataset Card for "mmlu-virology-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-virology-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:10:39+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 5034, "num_examples": 5}, {"name": "test", "num_bytes": 356177, "num_examples": 166}], "download_size": 12357, "dataset_size": 361211}}
2023-08-21T06:42:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-virology-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-virology-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-virology-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 24 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-virology-neg-prepend-fix\"\n\nMore Information needed" ]
fc9a7469988efa4a896c46a29665cc069077b756
# Dataset Card for "mmlu-world_religions-neg-prepend-fix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-world_religions-neg-prepend-fix
[ "region:us" ]
2023-08-18T11:10:53+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 4146, "num_examples": 5}, {"name": "test", "num_bytes": 260517, "num_examples": 171}], "download_size": 10872, "dataset_size": 264663}}
2023-08-21T06:42:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mmlu-world_religions-neg-prepend-fix" More Information needed
[ "# Dataset Card for \"mmlu-world_religions-neg-prepend-fix\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mmlu-world_religions-neg-prepend-fix\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mmlu-world_religions-neg-prepend-fix\"\n\nMore Information needed" ]
a44d3c4a4ee11511991a827c5db5660498931cc9
# Dataset of kishin_sagume/稀神サグメ/키신사구메 (Touhou) This is the dataset of kishin_sagume/稀神サグメ/키신사구메 (Touhou), containing 500 images and their tags. The core tags of this character are `short_hair, single_wing, wings, red_eyes, bow, feathered_wings, grey_hair, red_bow, bangs, white_hair, white_wings, breasts, hair_between_eyes, braid, french_braid`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 680.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kishin_sagume_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 379.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kishin_sagume_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1175 | 782.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kishin_sagume_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 593.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kishin_sagume_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1175 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kishin_sagume_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/kishin_sagume_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, brooch, cowboy_shot, long_sleeves, looking_at_viewer, medium_breasts, open_jacket, purple_dress, red_bowtie, solo, standing, covering_mouth, blush | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, closed_mouth, long_sleeves, looking_at_viewer, open_jacket, purple_dress, simple_background, solo, white_background, brooch, red_bowtie, upper_body, white_jacket | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bowtie, long_sleeves, looking_at_viewer, purple_dress, simple_background, solo, white_background, open_jacket | | 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, long_sleeves, looking_at_viewer, purple_dress, solo, open_jacket, red_bowtie | | 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, full_body, long_sleeves, purple_dress, red_bowtie, solo, boots, brown_footwear, looking_at_viewer, open_jacket, simple_background, white_background, blush, closed_mouth, covering_mouth | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, blush, hetero, large_breasts, completely_nude, navel, nipples, penis, sex, solo_focus, vaginal, mosaic_censoring, pov, spread_legs, closed_mouth, cowgirl_position, cum_in_pussy, girl_on_top, holding_hands, interlocked_fingers, looking_at_viewer, open_mouth, shiny_skin, smile | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | brooch | cowboy_shot | long_sleeves | looking_at_viewer | medium_breasts | open_jacket | purple_dress | red_bowtie | solo | standing | covering_mouth | blush | closed_mouth | simple_background | white_background | upper_body | white_jacket | bowtie | full_body | boots | brown_footwear | 1boy | hetero | large_breasts | completely_nude | navel | nipples | penis | sex | solo_focus | vaginal | mosaic_censoring | pov | spread_legs | cowgirl_position | cum_in_pussy | girl_on_top | holding_hands | interlocked_fingers | open_mouth | shiny_skin | smile | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:--------------|:---------------|:--------------------|:-----------------|:--------------|:---------------|:-------------|:-------|:-----------|:-----------------|:--------|:---------------|:--------------------|:-------------------|:-------------|:---------------|:---------|:------------|:--------|:-----------------|:-------|:---------|:----------------|:------------------|:--------|:----------|:--------|:------|:-------------|:----------|:-------------------|:------|:--------------|:-------------------|:---------------|:--------------|:----------------|:----------------------|:-------------|:-------------|:--------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | | X | X | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | X | | X | X | | X | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | X | | X | X | X | X | | X | X | X | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | | X | | | | | | | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
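Besides the waifuc loader shown above, the IMG+TXT packages listed in the package table (for example `dataset-800.zip`) can be read with plain Python. The sketch below is a minimal example, assuming the archive stores each image next to a same-named `.txt` file holding its tags; the exact internal layout of the zip is an assumption rather than something documented here.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from PIL import Image

# download and extract the 800px IMG+TXT package (filename taken from the table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/kishin_sagume_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its same-named .txt tag file (assumed layout)
for root, _, files in os.walk(dataset_dir):
    for name in files:
        if name.lower().endswith(('.png', '.jpg', '.jpeg', '.webp')):
            image_path = os.path.join(root, name)
            tag_path = os.path.splitext(image_path)[0] + '.txt'
            if not os.path.exists(tag_path):
                continue  # skip images that have no tag file next to them
            with open(tag_path, 'r', encoding='utf-8') as f:
                tags = f.read().strip()
            image = Image.open(image_path)
            print(image.size, tags)
```

The same pattern applies to the other IMG+TXT packages in this dump; only the `repo_id` and `filename` change.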
CyberHarem/kishin_sagume_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T11:19:41+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-14T20:23:43+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of kishin\_sagume/稀神サグメ/키신사구메 (Touhou) ============================================== This is the dataset of kishin\_sagume/稀神サグメ/키신사구메 (Touhou), containing 500 images and their tags. The core tags of this character are 'short\_hair, single\_wing, wings, red\_eyes, bow, feathered\_wings, grey\_hair, red\_bow, bangs, white\_hair, white\_wings, breasts, hair\_between\_eyes, braid, french\_braid', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
46bd4d0252da28f01f69f26725edeca1f53ed2df
# Dataset Card for "flipkart-scraped-dresses-10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
harshiitsingh/flipkart-scraped-dresses-10
[ "region:us" ]
2023-08-18T11:47:45+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 102203.0, "num_examples": 10}], "download_size": 102337, "dataset_size": 102203.0}}
2023-08-18T11:47:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for "flipkart-scraped-dresses-10" More Information needed
[ "# Dataset Card for \"flipkart-scraped-dresses-10\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"flipkart-scraped-dresses-10\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"flipkart-scraped-dresses-10\"\n\nMore Information needed" ]
90487b057ba04d65c6df824a40e1e3dccb276b11
# Dataset Card for "ner-test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
davanstrien/ner-test
[ "region:us" ]
2023-08-18T11:55:01+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "ner_tags", "sequence": "string"}, {"name": "tokens", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 1548186, "num_examples": 5216}, {"name": "valid", "num_bytes": 392764, "num_examples": 1304}], "download_size": 0, "dataset_size": 1940950}}
2023-08-18T11:56:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for "ner-test" More Information needed
[ "# Dataset Card for \"ner-test\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"ner-test\"\n\nMore Information needed" ]
[ 6, 13 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"ner-test\"\n\nMore Information needed" ]
a36b8c633c2fd56438f2043934d71f17914c0e55
Read me.
ryanrwatkins/prompts_archive
[ "license:cc-by-4.0", "region:us" ]
2023-08-18T11:55:27+00:00
{"license": "cc-by-4.0"}
2023-08-18T22:01:13+00:00
[]
[]
TAGS #license-cc-by-4.0 #region-us
Read me.
[]
[ "TAGS\n#license-cc-by-4.0 #region-us \n" ]
[ 15 ]
[ "passage: TAGS\n#license-cc-by-4.0 #region-us \n" ]
ab7fffd224befb38c2efb0b71422db47047a5254
# Dataset of kisume/キスメ/키스메 (Touhou) This is the dataset of kisume/キスメ/키스메 (Touhou), containing 500 images and their tags. The core tags of this character are `green_hair, twintails, short_hair, hair_ornament, green_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 396.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisume_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 284.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisume_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 960 | 542.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisume_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 371.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisume_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 960 | 678.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisume_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/kisume_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bangs, closed_mouth, hair_bobbles, in_bucket, looking_at_viewer, solo, white_kimono, wooden_bucket, long_sleeves, collarbone, simple_background, white_background, blush, smile | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, hair_bobbles, in_bucket, open_mouth, solo, wooden_bucket, blush | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 2girls, blonde_hair, hair_bobbles, in_bucket, wooden_bucket, blush, smile | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, hair_bobbles, nipples, nude, open_mouth, solo, barefoot, loli, spread_legs, blush, navel, small_breasts, anus, cum_in_pussy, uncensored, white_background, feet, flat_chest, spread_pussy, tears | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bangs | closed_mouth | hair_bobbles | in_bucket | looking_at_viewer | solo | white_kimono | wooden_bucket | long_sleeves | collarbone | simple_background | white_background | blush | smile | open_mouth | 2girls | blonde_hair | nipples | nude | barefoot | loli | spread_legs | navel | small_breasts | anus | cum_in_pussy | uncensored | feet | flat_chest | spread_pussy | tears | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------------|:---------------|:------------|:--------------------|:-------|:---------------|:----------------|:---------------|:-------------|:--------------------|:-------------------|:--------|:--------|:-------------|:---------|:--------------|:----------|:-------|:-----------|:-------|:--------------|:--------|:----------------|:-------|:---------------|:-------------|:-------|:-------------|:---------------|:--------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | X | | X | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | | | | X 
| X | | | | X | | | | | X | X | | X | X | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | | | X | | | | | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/kisume_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T11:55:54+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-14T21:34:24+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of kisume/キスメ/키스메 (Touhou) ================================== This is the dataset of kisume/キスメ/키스메 (Touhou), containing 500 images and their tags. The core tags of this character are 'green\_hair, twintails, short\_hair, hair\_ornament, green\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
2e8d8bf9326022ed824b266ed40d3b420b2905ee
Sentence Follow-up Dataset

This dataset can be used to fine-tune models for text-to-text generation tasks, in particular for predicting the follow-up sentence. It has two columns, "sentence_1" and "sentence_2", where sentence_2 is a follow-up sentence to sentence_1.
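A minimal sketch for turning the two columns into source/target pairs for sequence-to-sequence fine-tuning is shown below. The split name `train` is an assumption, since this card only documents the column names.

```python
from datasets import load_dataset

# the "train" split name is an assumption; only the two columns are documented
ds = load_dataset("Dhruvil47/sentence_followup", split="train")

def to_seq2seq(example):
    # sentence_2 is the follow-up of sentence_1, so it becomes the target
    return {
        "input_text": example["sentence_1"],
        "target_text": example["sentence_2"],
    }

pairs = ds.map(to_seq2seq, remove_columns=ds.column_names)
print(pairs[0])
```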
Dhruvil47/sentence_followup
[ "task_categories:text-generation", "size_categories:1M<n<10M", "language:en", "license:unknown", "region:us" ]
2023-08-18T12:21:36+00:00
{"language": ["en"], "license": "unknown", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation"]}
2023-08-18T14:18:42+00:00
[]
[ "en" ]
TAGS #task_categories-text-generation #size_categories-1M<n<10M #language-English #license-unknown #region-us
Sentence Follow-up Dataset

This dataset can be used to fine-tune models for text-to-text generation tasks, in particular for predicting the follow-up sentence. It has two columns, "sentence_1" and "sentence_2", where sentence_2 is a follow-up sentence to sentence_1.
[]
[ "TAGS\n#task_categories-text-generation #size_categories-1M<n<10M #language-English #license-unknown #region-us \n" ]
[ 40 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1M<n<10M #language-English #license-unknown #region-us \n" ]
9b8c6e62a0456e3242c83a7f129ac16ab51c112d
# Dataset of medicine_melancholy/メディスン・メランコリー/메디슨멜랑콜리 (Touhou) This is the dataset of medicine_melancholy/メディスン・メランコリー/메디슨멜랑콜리 (Touhou), containing 500 images and their tags. The core tags of this character are `blonde_hair, short_hair, ribbon, hair_ribbon, bow, blue_eyes, red_ribbon, red_bow`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 514.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medicine_melancholy_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 335.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medicine_melancholy_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1037 | 665.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medicine_melancholy_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 468.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medicine_melancholy_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1037 | 877.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medicine_melancholy_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/medicine_melancholy_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, lily_of_the_valley, puffy_short_sleeves, solo, looking_at_viewer, shirt, blush, red_skirt | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, shirt, skirt, smile, solo, lily_of_the_valley, puffy_sleeves, short_sleeves, looking_at_viewer, sitting | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, flower, skirt, solo, smile, blush | | 3 | 12 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, frilled_shirt_collar, red_bowtie, red_skirt, solo, black_shirt, frilled_sleeves, looking_at_viewer, puffy_short_sleeves, bangs, simple_background, closed_mouth, wavy_hair | | 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, looking_at_viewer, red_bowtie, solo, upper_body, frilled_shirt_collar, puffy_short_sleeves, bangs, black_shirt, simple_background, white_background, closed_mouth, blush | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bangs, looking_at_viewer, medium_hair, puffy_short_sleeves, red_bowtie, red_skirt, solo, holding_flower, lily_of_the_valley, purple_background, white_flower, black_shirt, closed_mouth, frilled_skirt, hair_between_eyes, red_shirt, ribbon-trimmed_skirt, :d, cowboy_shot, frilled_sleeves, open_mouth, red_footwear, shoes, sitting, standing, white_socks | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, smile, solo, ^_^, open_mouth, blush | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, dress, lily_of_the_valley, solo, doll_joints, hair_bow, skirt | | 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, black_bowtie, black_skirt, full_body, hair_bow, solo, bangs, closed_mouth, puffy_short_sleeves, shoes, collared_shirt, looking_at_viewer, red_bowtie, red_footwear, simple_background, 
white_background, chibi, fairy_wings, white_socks | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, solo, hair_bow, open_mouth, blush, looking_at_viewer | | 10 | 8 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, loli, solo, blush, on_back, flat_chest, navel, nipples, pussy_juice, censored, nude, open_mouth, spread_legs | | 11 | 9 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, hetero, solo_focus, blush, 1boy, penis, loli, cum_in_pussy, facial, sex, vaginal, bar_censor, open_mouth | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | lily_of_the_valley | puffy_short_sleeves | solo | looking_at_viewer | shirt | blush | red_skirt | skirt | smile | puffy_sleeves | short_sleeves | sitting | flower | frilled_shirt_collar | red_bowtie | black_shirt | frilled_sleeves | bangs | simple_background | closed_mouth | wavy_hair | upper_body | white_background | medium_hair | holding_flower | purple_background | white_flower | frilled_skirt | hair_between_eyes | red_shirt | ribbon-trimmed_skirt | :d | cowboy_shot | open_mouth | red_footwear | shoes | standing | white_socks | ^_^ | dress | doll_joints | hair_bow | black_bowtie | black_skirt | full_body | collared_shirt | chibi | fairy_wings | loli | on_back | flat_chest | navel | nipples | pussy_juice | censored | nude | spread_legs | hetero | solo_focus | 1boy | penis | cum_in_pussy | facial | sex | vaginal | bar_censor | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------------|:----------------------|:-------|:--------------------|:--------|:--------|:------------|:--------|:--------|:----------------|:----------------|:----------|:---------|:-----------------------|:-------------|:--------------|:------------------|:--------|:--------------------|:---------------|:------------|:-------------|:-------------------|:--------------|:-----------------|:--------------------|:---------------|:----------------|:--------------------|:------------|:-----------------------|:-----|:--------------|:-------------|:---------------|:--------|:-----------|:--------------|:------|:--------|:--------------|:-----------|:---------------|:--------------|:------------|:-----------------|:--------|:--------------|:-------|:----------|:-------------|:--------|:----------|:--------------|:-----------|:-------|:--------------|:---------|:-------------|:-------|:--------|:---------------|:---------|:------|:----------|:-------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | | | X | X 
| X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | | | X | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 12 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | X | X | | X | | | | | | | | X | X | X | | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | X | | | X | | | | | X | | | X | X | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | X | X | X | | | | | | | | | | | X | | | X | X | X | | | X | | | | | | | | | | | | X | X | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 8 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | 11 | 9 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X |
CyberHarem/medicine_melancholy_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T12:24:45+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-14T22:42:18+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of medicine\_melancholy/メディスン・メランコリー/메디슨멜랑콜리 (Touhou) ============================================================= This is the dataset of medicine\_melancholy/メディスン・メランコリー/메디슨멜랑콜리 (Touhou), containing 500 images and their tags. The core tags of this character are 'blonde\_hair, short\_hair, ribbon, hair\_ribbon, bow, blue\_eyes, red\_ribbon, red\_bow', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
fe7916a0e8ccb347f41c1683ec2dfb0e375f72cb
# Dataset Card for "portuguese_europarl_1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
arubenruben/portuguese_europarl
[ "region:us" ]
2023-08-18T12:30:47+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "pt-PT", "1": "pt-BR"}}}}], "splits": [{"name": "train", "num_bytes": 276595020, "num_examples": 7547}, {"name": "test", "num_bytes": 80381927, "num_examples": 1887}], "download_size": 193710364, "dataset_size": 356976947}}
2023-08-18T12:31:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for "portuguese_europarl_1" More Information needed
[ "# Dataset Card for \"portuguese_europarl_1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"portuguese_europarl_1\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"portuguese_europarl_1\"\n\nMore Information needed" ]
a58e928286ce6e06e49e2c1d49d553dc8246b6c5
# Dataset Card for "Open-Platypus-flattened-text" This is a version of the [garage-bAInd/Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus) dataset. It has a single "text" column containing the "Instruction", "Input" and "Response" concatenated in a large string. The following templates are used (without prompt preamble). 1. If there is no "Input": ``` ### Instruction: Some instruction goes here ### Response: The response output goes here ``` 2. If there is an "Input" text: ``` ### Instruction: Some instruction goes here ### Input: Here is the input text ### Response: The response output goes here ```
alup/Open-Platypus-flattened-text
[ "license:mit", "region:us" ]
2023-08-18T12:44:33+00:00
{"license": "mit", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 31108949, "num_examples": 24926}], "download_size": 15282012, "dataset_size": 31108949}}
2023-08-18T12:57:12+00:00
[]
[]
TAGS #license-mit #region-us
# Dataset Card for "Open-Platypus-flattened-text" This is a version of the garage-bAInd/Open-Platypus dataset. It has a single "text" column containing the "Instruction", "Input" and "Response" concatenated in a large string. The following templates are used (without prompt preamble). 1. If there is no "Input": 2. If there is an "Input" text:
[ "# Dataset Card for \"Open-Platypus-flattened-text\"\n\nThis is a version of the garage-bAInd/Open-Platypus dataset.\nIt has a single \"text\" column containing the \"Instruction\", \"Input\" and \"Response\" concatenated in a large string.\n\nThe following templates are used (without prompt preamble).\n\n1. If there is no \"Input\":\n\n\n\n2. If there is an \"Input\" text:" ]
[ "TAGS\n#license-mit #region-us \n", "# Dataset Card for \"Open-Platypus-flattened-text\"\n\nThis is a version of the garage-bAInd/Open-Platypus dataset.\nIt has a single \"text\" column containing the \"Instruction\", \"Input\" and \"Response\" concatenated in a large string.\n\nThe following templates are used (without prompt preamble).\n\n1. If there is no \"Input\":\n\n\n\n2. If there is an \"Input\" text:" ]
[ 11, 108 ]
[ "passage: TAGS\n#license-mit #region-us \n# Dataset Card for \"Open-Platypus-flattened-text\"\n\nThis is a version of the garage-bAInd/Open-Platypus dataset.\nIt has a single \"text\" column containing the \"Instruction\", \"Input\" and \"Response\" concatenated in a large string.\n\nThe following templates are used (without prompt preamble).\n\n1. If there is no \"Input\":\n\n\n\n2. If there is an \"Input\" text:" ]
f605a7d27f1e50c20e5a348b0329acb3667ef9ff
# Dataset of kudamaki_tsukasa/菅牧典/쿠다마키츠카사 (Touhou) This is the dataset of kudamaki_tsukasa/菅牧典/쿠다마키츠카사 (Touhou), containing 500 images and their tags. The core tags of this character are `animal_ears, fox_ears, short_hair, hair_between_eyes, fox_tail, blonde_hair, tail, yellow_eyes, green_ribbon, ribbon, fox_girl, bangs, breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 671.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 356.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1221 | 783.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 584.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1221 | 1.15 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/kudamaki_tsukasa_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, full_body, romper, short_sleeves, simple_background, solo, brown_eyes, looking_at_viewer, smile, test_tube, white_socks, holding, blush, fox_shadow_puppet, open_mouth, white_background, standing, tabi | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, romper, short_sleeves, simple_background, smile, solo, upper_body, white_background, looking_at_viewer, open_mouth, :3, double_fox_shadow_puppet, blush, animal_ear_fluff | | 2 | 25 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, closed_eyes, romper, signature, smile, light_brown_hair, solo, open_mouth, short_sleeves, full_body, barefoot, white_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | full_body | romper | short_sleeves | simple_background | solo | brown_eyes | looking_at_viewer | smile | test_tube | white_socks | holding | blush | fox_shadow_puppet | open_mouth | white_background | standing | tabi | upper_body | :3 | double_fox_shadow_puppet | animal_ear_fluff | closed_eyes | signature | light_brown_hair | barefoot | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:---------|:----------------|:--------------------|:-------|:-------------|:--------------------|:--------|:------------|:--------------|:----------|:--------|:--------------------|:-------------|:-------------------|:-----------|:-------|:-------------|:-----|:---------------------------|:-------------------|:--------------|:------------|:-------------------|:-----------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | X | X | | X | X | | | | X | | X | X | | | X | X | X | X | | | | | | 2 | 25 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | X | | | X | | | | X | | X | X | | | | | | | X | X | X | X |
CyberHarem/kudamaki_tsukasa_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T12:48:17+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T04:07:25+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of kudamaki\_tsukasa/菅牧典/쿠다마키츠카사 (Touhou) ================================================= This is the dataset of kudamaki\_tsukasa/菅牧典/쿠다마키츠카사 (Touhou), containing 500 images and their tags. The core tags of this character are 'animal\_ears, fox\_ears, short\_hair, hair\_between\_eyes, fox\_tail, blonde\_hair, tail, yellow\_eyes, green\_ribbon, ribbon, fox\_girl, bangs, breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
ebcef6051532f5d9357e2c186c8db6065e2c36ca
# FakeRecogna ## Dataset Description - **Homepage:** [https://github.com/Gabriel-Lino-Garcia/FakeRecogna](https://github.com/Gabriel-Lino-Garcia/FakeRecogna) - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary FakeRecogna is a dataset comprised of real and fake news. The real news is not directly linked to fake news and vice-versa, which could lead to a biased classification. The news collection was performed by crawlers developed for mining pages of well-known and of great national importance agency news. ### Supported Tasks and Leaderboards [More Information Needed] ### Languages The dataset is in Portuguese. ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information If you use "FakeRecogna Dataset", please cite: ```bibtex @inproceedings{10.1007/978-3-030-98305-5_6, author = {Garcia, Gabriel L. and Afonso, Luis C. S. and Papa, Jo\~{a}o P.}, title = {FakeRecogna: A New Brazilian Corpus for Fake News Detection}, year = {2022}, isbn = {978-3-030-98304-8}, publisher = {Springer-Verlag}, address = {Berlin, Heidelberg}, url = {https://doi.org/10.1007/978-3-030-98305-5_6}, doi = {10.1007/978-3-030-98305-5_6}, abstract = {Fake news has become a research topic of great importance in Natural Language Processing due to its negative impact on our society. Although its pertinence, there are few datasets available in Brazilian Portuguese and mostly comprise few samples. Therefore, this paper proposes creating a new fake news dataset named FakeRecogna that contains a greater number of samples, more up-to-date news, and covering a few of the most important categories. We perform a toy evaluation over the created dataset using traditional classifiers such as Naive Bayes, Optimum-Path Forest, and Support Vector Machines. A Convolutional Neural Network is also evaluated in the context of fake news detection in the proposed dataset.}, booktitle = {Computational Processing of the Portuguese Language: 15th International Conference, PROPOR 2022, Fortaleza, Brazil, March 21–23, 2022, Proceedings}, pages = {57–67}, numpages = {11}, keywords = {Fake news, Corpus, Portuguese}, location = {Fortaleza, Brazil} } ``` ### Contributions Thanks to [@ju-resplande](https://github.com/ju-resplande) for adding this dataset.
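A minimal loading sketch is shown below. Since the splits and column names are not documented in this card, the code only inspects whatever is available, and it assumes the repository files are in a format the `datasets` library can load automatically.

```python
from datasets import load_dataset

# splits and columns are undocumented here, so load everything and inspect it
ds = load_dataset("fake-news-UFG/FakeRecogna")

print(ds)                        # available splits and their sizes
split_name = next(iter(ds))
print(ds[split_name].features)   # column names and types
print(ds[split_name][0])         # one example record
```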
fake-news-UFG/FakeRecogna
[ "language_creators:found", "multilinguality:monolingual", "size_categories:10K<n<100K", "language:pt", "region:us" ]
2023-08-18T13:04:34+00:00
{"language_creators": ["found"], "language": ["pt"], "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "pretty_name": "FakeRecogna", "language_details": "pt-BR"}
2023-08-18T13:45:07+00:00
[]
[ "pt" ]
TAGS #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #language-Portuguese #region-us
# FakeRecogna ## Dataset Description - Homepage: URL - Repository: - Paper: - Leaderboard: - Point of Contact: ### Dataset Summary FakeRecogna is a dataset comprised of real and fake news. The real news is not directly linked to fake news and vice-versa, which could lead to a biased classification. The news collection was performed by crawlers developed for mining pages of well-known and of great national importance agency news. ### Supported Tasks and Leaderboards ### Languages The dataset is in Portuguese. ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information If you use "FakeRecogna Dataset", please cite: ### Contributions Thanks to @ju-resplande for adding this dataset.
[ "# FakeRecogna", "## Dataset Description\n\n- Homepage: URL\n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\n\nFakeRecogna is a dataset comprised of real and fake news. \nThe real news is not directly linked to fake news and vice-versa, which could lead to a biased classification.\nThe news collection was performed by crawlers developed for mining pages of well-known and of great national importance agency news.", "### Supported Tasks and Leaderboards", "### Languages\n\nThe dataset is in Portuguese.", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information\n\n\n\n\n\nIf you use \"FakeRecogna Dataset\", please cite:", "### Contributions\n\nThanks to @ju-resplande for adding this dataset." ]
[ "TAGS\n#language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #language-Portuguese #region-us \n", "# FakeRecogna", "## Dataset Description\n\n- Homepage: URL\n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\n\nFakeRecogna is a dataset comprised of real and fake news. \nThe real news is not directly linked to fake news and vice-versa, which could lead to a biased classification.\nThe news collection was performed by crawlers developed for mining pages of well-known and of great national importance agency news.", "### Supported Tasks and Leaderboards", "### Languages\n\nThe dataset is in Portuguese.", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information\n\n\n\n\n\nIf you use \"FakeRecogna Dataset\", please cite:", "### Contributions\n\nThanks to @ju-resplande for adding this dataset." ]
[ 40, 6, 25, 76, 10, 13, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 21, 19 ]
[ "passage: TAGS\n#language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #language-Portuguese #region-us \n# FakeRecogna## Dataset Description\n\n- Homepage: URL\n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nFakeRecogna is a dataset comprised of real and fake news. \nThe real news is not directly linked to fake news and vice-versa, which could lead to a biased classification.\nThe news collection was performed by crawlers developed for mining pages of well-known and of great national importance agency news.### Supported Tasks and Leaderboards### Languages\n\nThe dataset is in Portuguese.## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information\n\n\n\n\n\nIf you use \"FakeRecogna Dataset\", please cite:### Contributions\n\nThanks to @ju-resplande for adding this dataset." ]
9b1a9bfb5ad02172b6a1960affda0ff428943b26
## source

mix data from https://www.kaggle.com/datasets/allanyiinai/chinesecorpus

- use

```python
from datasets import load_dataset

ds = load_dataset("ticoAg/ChineseCorpus-Kaggle-fanti")
```

- example

```json
[
    {
        "text": "2017年12月5日,重慶市交委正式下發《關于新建市郊鐵路磨心坡至合川線工程初步設計的批復》,2017年計劃開工四個節點工程,包括渭沱貨運站場、土場貨運站場、嘉陵江特大橋、九峰山遂道。"
    },
    {
        "text": "2017年7月6日,線路重要節點合川渭沱貨運站開工建設,線路開始建設,項目建設工期為48個月。"
    },
    {
        "text": "日前,渝合線二期(合川段)施工出現了停滯,至今仍未解決,合川區人民政府在2019、2020年均稱將力促市郊鐵路渝合線復工。"
    },
    {
        "text": "2012年,12歲的加比亞加盟米蘭青訓營。在 2017 年 5 月 7 日米蘭主場對陣羅馬的意甲比賽之前,他第一次受到主教練蒙特拉的征召。然而,他仍然是一個沒獲得出場機會的替補。 2017 年 8 月 24 日,他在歐聯杯預選賽對陣斯肯迪亞的比賽中首次代表俱樂部出場,他在第 73 分鐘替補洛卡特利出場。"
    },
    {
        "text": "他在2018 年歐洲 19 歲以下歐洲錦標賽上代表意大利 U19參加了兩場小組賽,意大利獲得亞軍。隨后他隨意大利 U20參加了2019 年國際足聯 U-20 世界杯。"
    }
]
```
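Given the raw size of roughly 4 GB, a streaming load may be preferable to a full download. The sketch below assumes a `train` split and that the data files are in a format the `datasets` library can stream; neither is confirmed above. The `text` field matches the example records.

```python
from datasets import load_dataset

# streaming avoids materialising the full corpus locally (assumed "train" split)
ds = load_dataset("ticoAg/ChineseCorpus-Kaggle-fanti", split="train", streaming=True)

for i, example in enumerate(ds):
    print(example["text"])
    if i >= 2:  # just peek at the first few records
        break
```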
ticoAg/ChineseCorpus-Kaggle-fanti
[ "task_categories:text-generation", "size_categories:10M<n<100M", "language:tw", "language:zh", "license:apache-2.0", "region:us" ]
2023-08-18T13:08:03+00:00
{"language": ["tw", "zh"], "license": "apache-2.0", "size_categories": ["10M<n<100M"], "task_categories": ["text-generation"], "39436887": "examples", "raw size": "4G"}
2023-08-19T08:52:06+00:00
[]
[ "tw", "zh" ]
TAGS #task_categories-text-generation #size_categories-10M<n<100M #language-Twi #language-Chinese #license-apache-2.0 #region-us
## source mix data from URL - use - example
[ "## source\n\nmix data from URL\n\n- use\n\n\n- example" ]
[ "TAGS\n#task_categories-text-generation #size_categories-10M<n<100M #language-Twi #language-Chinese #license-apache-2.0 #region-us \n", "## source\n\nmix data from URL\n\n- use\n\n\n- example" ]
[ 47, 10 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-10M<n<100M #language-Twi #language-Chinese #license-apache-2.0 #region-us \n## source\n\nmix data from URL\n\n- use\n\n\n- example" ]
08ad8c5ff29558ec92443d9bf3ad8cfb37407bed
# Dataset of hecatia_lapislazuli/ヘカーティア・ラピスラズリ (Touhou) This is the dataset of hecatia_lapislazuli/ヘカーティア・ラピスラズリ (Touhou), containing 500 images and their tags. The core tags of this character are `red_hair, polos_crown, red_eyes, breasts, bangs, long_hair, medium_hair, hair_between_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 655.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hecatia_lapislazuli_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 381.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hecatia_lapislazuli_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1243 | 825.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hecatia_lapislazuli_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 585.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hecatia_lapislazuli_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1243 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hecatia_lapislazuli_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/hecatia_lapislazuli_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, black_shirt, cleavage, clothes_writing, earrings, large_breasts, off-shoulder_shirt, pointy_ears, simple_background, solo, t-shirt, upper_body, collarbone, short_sleeves, smile, looking_at_viewer, moon_(ornament), grey_background, closed_mouth, one-hour_drawing_challenge, one_eye_closed, black_choker, black_headwear, gold_chain, heart, white_background | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_shirt, clothes_writing, earth_(ornament), gold_chain, green_skirt, moon_(ornament), multicolored_skirt, plaid_skirt, short_sleeves, solo, t-shirt, bare_shoulders, black_choker, black_headwear, off-shoulder_shirt, smile, looking_at_viewer, simple_background, heart_print, medium_breasts, standing, blush, collarbone, white_background, purple_skirt, open_mouth, blue_skirt, hands_up, closed_mouth | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, black_shirt, clothes_writing, earth_(ornament), looking_at_viewer, moon_(ornament), multicolored_skirt, off-shoulder_shirt, plaid_skirt, smile, solo, t-shirt, gold_chain, short_sleeves, simple_background, black_choker, white_background, barefoot, full_body, green_skirt, open_mouth, bare_legs | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, black_shirt, clothes_writing, earth_(ornament), moon_(ornament), multicolored_skirt, off-shoulder_shirt, plaid_skirt, smile, solo, t-shirt, looking_at_viewer, short_sleeves, bare_shoulders, gold_chain, open_mouth, blush | | 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, chain, clothes_writing, collar, hat, looking_at_viewer, multicolored_skirt, solo, earth_(ornament), moon_(ornament), off-shoulder_shirt, t-shirt, smile, miniskirt, open_mouth | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_shirt, cleavage, crop_top, earrings, earth_(ornament), large_breasts, midriff, moon_(ornament), off-shoulder_shirt, pointy_ears, solo, t-shirt, clothes_writing, gold_chain, navel, plaid_skirt, simple_background, cropped_shirt, belt, 
multicolored_skirt, sitting, one-hour_drawing_challenge, white_background | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, bare_shoulders, black_choker, black_headwear, black_shirt, clothes_writing, collarbone, heart_print, looking_at_viewer, medium_breasts, off-shoulder_shirt, short_sleeves, simple_background, smile, solo, t-shirt, upper_body, blush, earth_(ornament), gold_chain, moon_(ornament), white_background, hand_up, closed_mouth, crop_top, earrings, hands_up, navel, open_mouth | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, blush, chain, large_breasts, looking_at_viewer, nipples, solo, completely_nude, collar, navel, simple_background, smile | | 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1boy, 1girl, blush, hetero, large_breasts, nipples, penis, solo_focus, sweat, collar, navel, sex, shirt_lift, vaginal, chain, heart, multicolored_skirt, on_back, pov, spread_legs, bare_shoulders, bed_sheet, black_shirt, bottomless, closed_eyes, clothes_pull, cowgirl_position, cum_in_pussy, earth_(ornament), girl_on_top, moon_(ornament), mosaic_censoring, off_shoulder, open_mouth, pubic_hair, uncensored | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_shirt | cleavage | clothes_writing | earrings | large_breasts | off-shoulder_shirt | pointy_ears | simple_background | solo | t-shirt | upper_body | collarbone | short_sleeves | smile | looking_at_viewer | moon_(ornament) | grey_background | closed_mouth | one-hour_drawing_challenge | one_eye_closed | black_choker | black_headwear | gold_chain | heart | white_background | earth_(ornament) | green_skirt | multicolored_skirt | plaid_skirt | heart_print | medium_breasts | standing | blush | purple_skirt | open_mouth | blue_skirt | hands_up | barefoot | full_body | bare_legs | chain | collar | hat | miniskirt | crop_top | midriff | navel | cropped_shirt | belt | sitting | hand_up | nipples | completely_nude | 1boy | hetero | penis | solo_focus | sweat | sex | shirt_lift | vaginal | on_back | pov | spread_legs | bed_sheet | bottomless | closed_eyes | clothes_pull | cowgirl_position | cum_in_pussy | girl_on_top | mosaic_censoring | off_shoulder | pubic_hair | uncensored | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:-----------|:------------------|:-----------|:----------------|:---------------------|:--------------|:--------------------|:-------|:----------|:-------------|:-------------|:----------------|:--------|:--------------------|:------------------|:------------------|:---------------|:-----------------------------|:-----------------|:---------------|:-----------------|:-------------|:--------|:-------------------|:-------------------|:--------------|:---------------------|:--------------|:--------------|:-----------------|:-----------|:--------|:---------------|:-------------|:-------------|:-----------|:-----------|:------------|:------------|:--------|:---------|:------|:------------|:-----------|:----------|:--------|:----------------|:-------|:----------|:----------|:----------|:------------------|:-------|:---------|:--------|:-------------|:--------|:------|:-------------|:----------|:----------|:------|:--------------|:------------|:-------------|:--------------|:---------------|:-------------------|:---------------|:--------------|:-------------------|:---------------|:-------------|:-------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | | | X | | X | X | X | | X | X | X | X | X | | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | X | | | X | | X | X | X | | | X | X | X | X | | | | | X | | X | | X | X | X | X | X | | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | X | | | X | | | X | X | | | X | X | X | X | | | | | | | X | | | X | | X | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | X | | | X | | | X | X | | | | X | X | X | | | | | | | | | | X | | X | | | | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | X | X | X | X | X | X | X | X | | | | | | X | | | X | | | | X | | X | X | | X | X | | | | | | | | | | | | 
| | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | | X | X | | X | | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | | X | X | | | | X | X | | X | | X | | X | | | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | | | | X | | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | X | | | | X | | | | | | | | | | | X | | | | | | | | X | | X | | X | | | | | X | | X | | | | | | X | X | | | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
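The IMG+TXT packages in the package table above bundle each image with a same-named `.txt` caption file, so they can also be consumed without waifuc. A minimal sketch of reading the 800px package this way — the flat image/caption layout inside the zip is an assumption based on the package description, not something the card states explicitly:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the preprocessed IMG+TXT packages (shorter side <= 800px)
zip_file = hf_hub_download(
    repo_id='CyberHarem/hecatia_lapislazuli_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it into a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with the caption file sharing its stem (layout assumed flat)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    caption_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(caption_path):
        with open(caption_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```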
CyberHarem/hecatia_lapislazuli_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T13:14:36+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-14T22:16:19+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of hecatia\_lapislazuli/ヘカーティア・ラピスラズリ (Touhou) ====================================================== This is the dataset of hecatia\_lapislazuli/ヘカーティア・ラピスラズリ (Touhou), containing 500 images and their tags. The core tags of this character are 'red\_hair, polos\_crown, red\_eyes, breasts, bangs, long\_hair, medium\_hair, hair\_between\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
293beaff154aa5fe0a025b160c1f9ba257f17027
# Dataset Card for "directv-zocalos-18-agosto-5fps" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
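Since the card itself is still a stub, here is a minimal, hedged sketch of loading the dataset with the standard `datasets` API; the `train` split and the feature names (`image`, `frame_time`, `zocalo_id`, `frame_number`) are taken from the repo metadata below, and everything else is an assumption:

```python
from datasets import load_dataset

# Small dataset: a single train split with 25 frame records per the repo metadata.
ds = load_dataset("Seenka/directv-zocalos-18-agosto-5fps", split="train")

first = ds[0]
print(first["frame_number"], first["frame_time"], first["zocalo_id"])
first["image"].show()  # the `image` feature decodes to a PIL.Image
```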
Seenka/directv-zocalos-18-agosto-5fps
[ "region:us" ]
2023-08-18T13:15:37+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "frame_time", "dtype": "time64[us]"}, {"name": "video_storage_path", "dtype": "string"}, {"name": "zocalo_id", "dtype": "string"}, {"name": "frame_number", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3021197.0, "num_examples": 25}], "download_size": 1857619, "dataset_size": 3021197.0}}
2023-08-18T13:15:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for "directv-zocalos-18-agosto-5fps" More Information needed
[ "# Dataset Card for \"directv-zocalos-18-agosto-5fps\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"directv-zocalos-18-agosto-5fps\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"directv-zocalos-18-agosto-5fps\"\n\nMore Information needed" ]
7be1129defd5d71916303066d4629abe3cfb2230
# Dataset of elly (Touhou) This is the dataset of elly (Touhou), containing 209 images and their tags. The core tags of this character are `blonde_hair, hat, ribbon, short_hair, yellow_eyes, hat_ribbon, bow, curly_hair, white_headwear, drill_hair, red_ribbon`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 209 | 185.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elly_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 209 | 127.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elly_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 384 | 235.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elly_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 209 | 171.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elly_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 384 | 302.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elly_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/elly_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bangs, holding_scythe, juliet_sleeves, red_dress, solo, frills, hat_bow, looking_at_viewer, smile, medium_breasts, sun_hat, closed_mouth, upper_body, open_mouth, red_bowtie | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, holding_scythe, long_sleeves, looking_at_viewer, smile, solo, red_dress, weapon, open_mouth | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, holding_scythe, solo, dress, smile, weapon | | 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blush, puffy_short_sleeves, solo, looking_at_viewer, medium_breasts, navel, red_skirt, brown_shirt, hat_bow, holding_scythe, open_mouth, smile, bangs, black_panties, highleg_panties, midriff, red_bow, frilled_skirt, stomach, black_ribbon, bobby_socks, brown_ribbon, crop_top, drill_locks, full_body, holding_weapon, neck_ribbon, shoes | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bangs | holding_scythe | juliet_sleeves | red_dress | solo | frills | hat_bow | looking_at_viewer | smile | medium_breasts | sun_hat | closed_mouth | upper_body | open_mouth | red_bowtie | long_sleeves | weapon | dress | blush | puffy_short_sleeves | navel | red_skirt | brown_shirt | black_panties | highleg_panties | midriff | red_bow | frilled_skirt | stomach | black_ribbon | bobby_socks | brown_ribbon | crop_top | drill_locks | full_body | holding_weapon | neck_ribbon | shoes | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------|:-----------------|:------------|:-------|:---------|:----------|:--------------------|:--------|:-----------------|:----------|:---------------|:-------------|:-------------|:-------------|:---------------|:---------|:--------|:--------|:----------------------|:--------|:------------|:--------------|:----------------|:------------------|:----------|:----------|:----------------|:----------|:---------------|:--------------|:---------------|:-----------|:--------------|:------------|:-----------------|:--------------|:--------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X 
| X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | X | X | | | X | X | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | | X | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | | X | | X | X | X | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/elly_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T13:20:01+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T06:07:32+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of elly (Touhou) ======================== This is the dataset of elly (Touhou), containing 209 images and their tags. The core tags of this character are 'blonde\_hair, hat, ribbon, short\_hair, yellow\_eyes, hat\_ribbon, bow, curly\_hair, white\_headwear, drill\_hair, red\_ribbon', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]