Dataset schema (column name, type, minimum and maximum length):

| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | list | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | list | 0 | 25 |
| languages | list | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | list | 0 | 352 |
| processed_texts | list | 1 | 353 |
| tokens_length | list | 1 | 353 |
| input_texts | list | 1 | 40 |
3dfa8d756d9562cfa4e5c0de2d8df2d94f492156
# Dataset Card for Evaluation run of Voicelab/trurl-2-13b-academic

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Voicelab/trurl-2-13b-academic
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-13b-academic](https://huggingface.co/Voicelab/trurl-2-13b-academic) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-26T13:54:25.329738](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic/blob/main/results_2023-10-26T13-54-25.329738.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):

```python
{
    "all": {
        "em": 0.38265520134228187,
        "em_stderr": 0.004977455184961271,
        "f1": 0.45275587248322363,
        "f1_stderr": 0.004784339979418239,
        "acc": 0.4373808097665532,
        "acc_stderr": 0.010248109703374565
    },
    "harness|drop|3": {
        "em": 0.38265520134228187,
        "em_stderr": 0.004977455184961271,
        "f1": 0.45275587248322363,
        "f1_stderr": 0.004784339979418239
    },
    "harness|gsm8k|5": {
        "acc": 0.10917361637604246,
        "acc_stderr": 0.008590089300511146
    },
    "harness|winogrande|5": {
        "acc": 0.7655880031570639,
        "acc_stderr": 0.011906130106237986
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
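As a complement to the loading example above, the aggregated metrics shown under "Latest results" can be pulled directly from the "results" configuration; a small sketch using only the configs and splits this card declares:

```python
from datasets import load_dataset

# Aggregated metrics: the "latest" split always points to the most recent run,
# while earlier runs remain available under their timestamped splits.
results = load_dataset(
    "open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic",
    "results",
    split="latest",
)
print(results[0])
```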
open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic
[ "region:us" ]
2023-09-21T20:27:11+00:00
{"pretty_name": "Evaluation run of Voicelab/trurl-2-13b-academic", "dataset_summary": "Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-13b-academic](https://huggingface.co/Voicelab/trurl-2-13b-academic) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-26T13:54:25.329738](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic/blob/main/results_2023-10-26T13-54-25.329738.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.38265520134228187,\n \"em_stderr\": 0.004977455184961271,\n \"f1\": 0.45275587248322363,\n \"f1_stderr\": 0.004784339979418239,\n \"acc\": 0.4373808097665532,\n \"acc_stderr\": 0.010248109703374565\n },\n \"harness|drop|3\": {\n \"em\": 0.38265520134228187,\n \"em_stderr\": 0.004977455184961271,\n \"f1\": 0.45275587248322363,\n \"f1_stderr\": 0.004784339979418239\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10917361637604246,\n \"acc_stderr\": 0.008590089300511146\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n }\n}\n```", "repo_url": "https://huggingface.co/Voicelab/trurl-2-13b-academic", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|arc:challenge|25_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_26T13_54_25.329738", "path": ["**/details_harness|drop|3_2023-10-26T13-54-25.329738.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-26T13-54-25.329738.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_26T13_54_25.329738", "path": ["**/details_harness|gsm8k|5_2023-10-26T13-54-25.329738.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-26T13-54-25.329738.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hellaswag|10_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T21-26-52.608718.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T21-26-52.608718.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_26T13_54_25.329738", "path": ["**/details_harness|winogrande|5_2023-10-26T13-54-25.329738.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-26T13-54-25.329738.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_21T21_26_52.608718", "path": ["results_2023-09-21T21-26-52.608718.parquet"]}, {"split": "2023_10_26T13_54_25.329738", "path": ["results_2023-10-26T13-54-25.329738.parquet"]}, {"split": "latest", "path": ["results_2023-10-26T13-54-25.329738.parquet"]}]}]}
2023-10-26T12:54:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Voicelab/trurl-2-13b-academic ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Voicelab/trurl-2-13b-academic on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-26T13:54:25.329738 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Voicelab/trurl-2-13b-academic", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Voicelab/trurl-2-13b-academic on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-26T13:54:25.329738(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Voicelab/trurl-2-13b-academic", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Voicelab/trurl-2-13b-academic on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-26T13:54:25.329738(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Voicelab/trurl-2-13b-academic## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Voicelab/trurl-2-13b-academic on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-26T13:54:25.329738(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f6f6c5385ad0612d8a30aaeb36ad41174375fbf8
# Dataset Card for "yandex_q_200k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dim/yandex_q_200k
[ "region:us" ]
2023-09-21T20:28:52+00:00
{"dataset_info": {"features": [{"name": "description", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 291927288.0830295, "num_examples": 200000}], "download_size": 155069887, "dataset_size": 291927288.0830295}}
2023-09-21T20:33:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for "yandex_q_200k" More Information needed
[ "# Dataset Card for \"yandex_q_200k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"yandex_q_200k\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"yandex_q_200k\"\n\nMore Information needed" ]
895e34ce6194c2295ea699097b6c4676c2ddcf9e
# Dataset Card for "sales-conversations" This dataset was created for the purpose of training a sales agent chatbot that can convince people. The initial idea came from: textbooks is all you need https://arxiv.org/abs/2306.11644 gpt-3.5-turbo was used for the generation # Structure The conversations have a customer and a salesman which appear always in changing order. customer, salesman, customer, salesman, etc. The customer always starts the conversation Who ends the conversation is not defined. # Generation Note that a textbook dataset is mandatory for this conversation generation. This examples rely on the following textbook dataset: https://huggingface.co/datasets/goendalf666/sales-textbook_for_convincing_and_selling The data generation code can be found here: https://github.com/tom813/salesGPT_foundation/blob/main/data_generation/textbook_and_conversation_gen.py The following prompt was used to create a conversation ``` def create_random_prompt(chapter, roles=["Customer", "Salesman"], range_vals=(3, 7), industries=None): if industries is None: industries = ["tech", "health", "finance"] # default industries; replace with your default list if different x = random.randint(*range_vals) y = 0 for i in reversed(range(3, 9)): # Generalized loop for range of values if i * x < 27: y = i break conversation_structure = "" for i in range(1, x+1): conversation_structure += f""" {roles[0]}: #{i}. sentence of {roles[0].lower()} {roles[1]}: #{i}. sentence of {roles[1].lower()}""" prompt = f"""Here is a chapter from a textbook about convincing people. The purpose of this data is to use it to fine tune a llm. Generate conversation examples that are based on the chapter that is provided and would help an ai to learn the topic by examples. Focus only on the topic that is given in the chapter when generating the examples. Let the example be in the {random.choice(industries)} industry. Follow this structure and put each conversation in a list of objects in json format. Only return the json nothing more: {conversation_structure} Generate {y} lists of those conversations Chapter:{chapter}""" return prompt ``` [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
goendalf666/sales-conversations
[ "task_categories:conversational", "size_categories:1K<n<10K", "language:en", "sales", "arxiv:2306.11644", "region:us" ]
2023-09-21T20:37:30+00:00
{"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["conversational"], "dataset_info": {"features": [{"name": "0", "dtype": "string"}, {"name": "1", "dtype": "string"}, {"name": "2", "dtype": "string"}, {"name": "3", "dtype": "string"}, {"name": "4", "dtype": "string"}, {"name": "5", "dtype": "string"}, {"name": "6", "dtype": "string"}, {"name": "7", "dtype": "string"}, {"name": "8", "dtype": "string"}, {"name": "9", "dtype": "string"}, {"name": "10", "dtype": "string"}, {"name": "11", "dtype": "string"}, {"name": "12", "dtype": "string"}, {"name": "13", "dtype": "string"}, {"name": "14", "dtype": "string"}, {"name": "15", "dtype": "string"}, {"name": "16", "dtype": "string"}, {"name": "17", "dtype": "string"}, {"name": "18", "dtype": "string"}, {"name": "19", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6821725, "num_examples": 3412}], "download_size": 2644154, "dataset_size": 6821725}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["sales"]}
2023-10-04T19:39:04+00:00
[ "2306.11644" ]
[ "en" ]
TAGS #task_categories-conversational #size_categories-1K<n<10K #language-English #sales #arxiv-2306.11644 #region-us
# Dataset Card for "sales-conversations" This dataset was created for the purpose of training a sales agent chatbot that can convince people. The initial idea came from: textbooks is all you need URL gpt-3.5-turbo was used for the generation # Structure The conversations have a customer and a salesman which appear always in changing order. customer, salesman, customer, salesman, etc. The customer always starts the conversation Who ends the conversation is not defined. # Generation Note that a textbook dataset is mandatory for this conversation generation. This examples rely on the following textbook dataset: URL The data generation code can be found here: URL The following prompt was used to create a conversation More Information needed
[ "# Dataset Card for \"sales-conversations\"\nThis dataset was created for the purpose of training a sales agent chatbot that can convince people.\n\nThe initial idea came from: textbooks is all you need URL\n\ngpt-3.5-turbo was used for the generation", "# Structure\nThe conversations have a customer and a salesman which appear always in changing order. customer, salesman, customer, salesman, etc. \nThe customer always starts the conversation\nWho ends the conversation is not defined.", "# Generation\nNote that a textbook dataset is mandatory for this conversation generation. This examples rely on the following textbook dataset:\nURL\n\nThe data generation code can be found here: URL\n\nThe following prompt was used to create a conversation\n\n\nMore Information needed" ]
[ "TAGS\n#task_categories-conversational #size_categories-1K<n<10K #language-English #sales #arxiv-2306.11644 #region-us \n", "# Dataset Card for \"sales-conversations\"\nThis dataset was created for the purpose of training a sales agent chatbot that can convince people.\n\nThe initial idea came from: textbooks is all you need URL\n\ngpt-3.5-turbo was used for the generation", "# Structure\nThe conversations have a customer and a salesman which appear always in changing order. customer, salesman, customer, salesman, etc. \nThe customer always starts the conversation\nWho ends the conversation is not defined.", "# Generation\nNote that a textbook dataset is mandatory for this conversation generation. This examples rely on the following textbook dataset:\nURL\n\nThe data generation code can be found here: URL\n\nThe following prompt was used to create a conversation\n\n\nMore Information needed" ]
[ 43, 58, 50, 53 ]
[ "passage: TAGS\n#task_categories-conversational #size_categories-1K<n<10K #language-English #sales #arxiv-2306.11644 #region-us \n# Dataset Card for \"sales-conversations\"\nThis dataset was created for the purpose of training a sales agent chatbot that can convince people.\n\nThe initial idea came from: textbooks is all you need URL\n\ngpt-3.5-turbo was used for the generation# Structure\nThe conversations have a customer and a salesman which appear always in changing order. customer, salesman, customer, salesman, etc. \nThe customer always starts the conversation\nWho ends the conversation is not defined.# Generation\nNote that a textbook dataset is mandatory for this conversation generation. This examples rely on the following textbook dataset:\nURL\n\nThe data generation code can be found here: URL\n\nThe following prompt was used to create a conversation\n\n\nMore Information needed" ]
bfde4fa2e5973e33fa2b226e25a13897e2ab512d
# Dataset Card for "13F_Reports" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jkv53/13F_Reports
[ "region:us" ]
2023-09-21T20:40:37+00:00
{"dataset_info": {"features": [{"name": "title", "dtype": "string"}, {"name": "body", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12636095, "num_examples": 1113}], "download_size": 3367995, "dataset_size": 12636095}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-21T20:40:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for "13F_Reports" More Information needed
[ "# Dataset Card for \"13F_Reports\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"13F_Reports\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"13F_Reports\"\n\nMore Information needed" ]
26ea5d5953c10de8d63e0d1fe6021724cd268ab9
# Dataset Card for "allsides_metaphor" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liyucheng/allsides_metaphor
[ "region:us" ]
2023-09-21T21:18:36+00:00
{"dataset_info": {"features": [{"name": "urls", "dtype": "string"}, {"name": "sents", "sequence": "string"}, {"name": "vua_metaphors", "sequence": "int64"}, {"name": "novel_metaphors", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 23322603, "num_examples": 28883}], "download_size": 2935494, "dataset_size": 23322603}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-25T19:38:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for "allsides_metaphor" More Information needed
[ "# Dataset Card for \"allsides_metaphor\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"allsides_metaphor\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"allsides_metaphor\"\n\nMore Information needed" ]
737584f55e49e393e62e240c3743ae3cb2fe2dc6
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-openllama-7b-v12-bf16

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-openllama-7b-v12-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-openllama-7b-v12-bf16](https://huggingface.co/OpenBuddy/openbuddy-openllama-7b-v12-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-28T16:26:27.647445](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16/blob/main/results_2023-10-28T16-26-27.647445.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):

```python
{
    "all": {
        "em": 0.24716862416107382,
        "em_stderr": 0.00441758766706025,
        "f1": 0.32628565436241674,
        "f1_stderr": 0.004407349418682706,
        "acc": 0.3793853179772531,
        "acc_stderr": 0.010982963093739001
    },
    "harness|drop|3": {
        "em": 0.24716862416107382,
        "em_stderr": 0.00441758766706025,
        "f1": 0.32628565436241674,
        "f1_stderr": 0.004407349418682706
    },
    "harness|gsm8k|5": {
        "acc": 0.10841546626231995,
        "acc_stderr": 0.008563852506627497
    },
    "harness|winogrande|5": {
        "acc": 0.6503551696921863,
        "acc_stderr": 0.013402073680850505
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
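Since the repository exposes one configuration per task, it can help to enumerate them before loading; a sketch using the datasets utility get_dataset_config_names (the task config name below comes from this card's own config list):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16"

# List all task configurations declared by the repo.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load the per-example details of one task; "latest" tracks the newest run.
drop_details = load_dataset(repo, "harness_drop_3", split="latest")
```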
open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16
[ "region:us" ]
2023-09-21T21:18:42+00:00
{"pretty_name": "Evaluation run of OpenBuddy/openbuddy-openllama-7b-v12-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-openllama-7b-v12-bf16](https://huggingface.co/OpenBuddy/openbuddy-openllama-7b-v12-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T16:26:27.647445](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16/blob/main/results_2023-10-28T16-26-27.647445.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24716862416107382,\n \"em_stderr\": 0.00441758766706025,\n \"f1\": 0.32628565436241674,\n \"f1_stderr\": 0.004407349418682706,\n \"acc\": 0.3793853179772531,\n \"acc_stderr\": 0.010982963093739001\n },\n \"harness|drop|3\": {\n \"em\": 0.24716862416107382,\n \"em_stderr\": 0.00441758766706025,\n \"f1\": 0.32628565436241674,\n \"f1_stderr\": 0.004407349418682706\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \"acc_stderr\": 0.008563852506627497\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6503551696921863,\n \"acc_stderr\": 0.013402073680850505\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddy/openbuddy-openllama-7b-v12-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|arc:challenge|25_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T16_26_27.647445", "path": ["**/details_harness|drop|3_2023-10-28T16-26-27.647445.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T16-26-27.647445.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T16_26_27.647445", "path": ["**/details_harness|gsm8k|5_2023-10-28T16-26-27.647445.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T16-26-27.647445.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hellaswag|10_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-18-19.303716.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-18-19.303716.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-18-19.303716.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T22-18-19.303716.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T22-18-19.303716.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T22-18-19.303716.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T16_26_27.647445", "path": ["**/details_harness|winogrande|5_2023-10-28T16-26-27.647445.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T16-26-27.647445.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_21T22_18_19.303716", "path": ["results_2023-09-21T22-18-19.303716.parquet"]}, {"split": "2023_10_28T16_26_27.647445", "path": ["results_2023-10-28T16-26-27.647445.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T16-26-27.647445.parquet"]}]}]}
2023-10-28T15:26:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-openllama-7b-v12-bf16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-openllama-7b-v12-bf16 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-28T16:26:27.647445 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of OpenBuddy/openbuddy-openllama-7b-v12-bf16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-openllama-7b-v12-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T16:26:27.647445(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of OpenBuddy/openbuddy-openllama-7b-v12-bf16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-openllama-7b-v12-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T16:26:27.647445(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OpenBuddy/openbuddy-openllama-7b-v12-bf16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-openllama-7b-v12-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T16:26:27.647445(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cc625d6d83bcd99fe26070f70c41dec5d069ddb3
# AutoTrain Dataset for project: vision-transformer ## Dataset Description This dataset has been automatically processed by AutoTrain for project vision-transformer. ### Languages The BCP-47 code for the dataset's language is unk. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "image": "<224x224 RGB PIL image>", "target": 0 }, { "image": "<224x224 RGB PIL image>", "target": 1 } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "image": "Image(decode=True, id=None)", "target": "ClassLabel(names=['benign', 'malignant'], id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follows: | Split name | Num samples | | ------------ | ------------------- | | train | 397 | | valid | 101 |
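To work with this dataset, a minimal sketch (assuming the repository id recorded in the metadata below, dharun2049/dermaflow-v1, and the "train"/"valid" splits listed above):

```python
from datasets import load_dataset

# Load both AutoTrain splits; "image" is a decoded PIL image and
# "target" is a ClassLabel over ['benign', 'malignant'].
ds = load_dataset("dharun2049/dermaflow-v1")

sample = ds["train"][0]
print(sample["image"].size)                  # expected (224, 224)
print(ds["train"].features["target"].names)  # ['benign', 'malignant']
print(len(ds["train"]), len(ds["valid"]))    # 397 and 101 per the card
```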
dharun2049/dermaflow-v1
[ "task_categories:image-classification", "size_categories:n<1K", "license:apache-2.0", "biology", "region:us" ]
2023-09-21T21:22:24+00:00
{"license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["image-classification"], "tags": ["biology"]}
2023-09-21T22:40:42+00:00
[]
[]
TAGS #task_categories-image-classification #size_categories-n<1K #license-apache-2.0 #biology #region-us
AutoTrain Dataset for project: vision-transformer ================================================= Dataset Description ------------------- This dataset has been automatically processed by AutoTrain for project vision-transformer. ### Languages The BCP-47 code for the dataset's language is unk. Dataset Structure ----------------- ### Data Instances A sample from this dataset looks as follows: ### Dataset Fields The dataset has the following fields (also called "features"): ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follows:
[ "### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA sample from this dataset looks as follows:", "### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
[ "TAGS\n#task_categories-image-classification #size_categories-n<1K #license-apache-2.0 #biology #region-us \n", "### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA sample from this dataset looks as follows:", "### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
[ 38, 27, 17, 23, 27 ]
[ "passage: TAGS\n#task_categories-image-classification #size_categories-n<1K #license-apache-2.0 #biology #region-us \n### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nA sample from this dataset looks as follows:### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
5ae25186cc5b092948aa5c4f7a1b6d77da426446
# Dataset Card for Evaluation run of Secbone/llama-33B-instructed ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Secbone/llama-33B-instructed - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Secbone/llama-33B-instructed](https://huggingface.co/Secbone/llama-33B-instructed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Secbone__llama-33B-instructed", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T13:35:27.526768](https://huggingface.co/datasets/open-llm-leaderboard/details_Secbone__llama-33B-instructed/blob/main/results_2023-10-25T13-35-27.526768.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.1983011744966443, "em_stderr": 0.004083268310230377, "f1": 0.2580683724832209, "f1_stderr": 0.0040518641243222474, "acc": 0.46863041707830366, "acc_stderr": 0.010527338901150537 }, "harness|drop|3": { "em": 0.1983011744966443, "em_stderr": 0.004083268310230377, "f1": 0.2580683724832209, "f1_stderr": 0.0040518641243222474 }, "harness|gsm8k|5": { "acc": 0.14404852160727824, "acc_stderr": 0.00967211097306528 }, "harness|winogrande|5": { "acc": 0.7932123125493291, "acc_stderr": 0.011382566829235791 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
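To inspect individual predictions rather than the aggregate scores, a minimal sketch (assuming the per-task configurations listed in the metadata below and the "train"-aliases-latest convention described above):

```python
from datasets import load_dataset

# Per-example WinoGrande details; "train" always aliases the latest run.
details = load_dataset(
    "open-llm-leaderboard/details_Secbone__llama-33B-instructed",
    "harness_winogrande_5",
    split="train",
)
print(details.to_pandas().head())  # eyeball a few rows of predictions
```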
open-llm-leaderboard/details_Secbone__llama-33B-instructed
[ "region:us" ]
2023-09-21T21:24:09+00:00
{"pretty_name": "Evaluation run of Secbone/llama-33B-instructed", "dataset_summary": "Dataset automatically created during the evaluation run of model [Secbone/llama-33B-instructed](https://huggingface.co/Secbone/llama-33B-instructed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Secbone__llama-33B-instructed\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T13:35:27.526768](https://huggingface.co/datasets/open-llm-leaderboard/details_Secbone__llama-33B-instructed/blob/main/results_2023-10-25T13-35-27.526768.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1983011744966443,\n \"em_stderr\": 0.004083268310230377,\n \"f1\": 0.2580683724832209,\n \"f1_stderr\": 0.0040518641243222474,\n \"acc\": 0.46863041707830366,\n \"acc_stderr\": 0.010527338901150537\n },\n \"harness|drop|3\": {\n \"em\": 0.1983011744966443,\n \"em_stderr\": 0.004083268310230377,\n \"f1\": 0.2580683724832209,\n \"f1_stderr\": 0.0040518641243222474\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14404852160727824,\n \"acc_stderr\": 0.00967211097306528\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235791\n }\n}\n```", "repo_url": "https://huggingface.co/Secbone/llama-33B-instructed", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|arc:challenge|25_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T13_35_27.526768", "path": ["**/details_harness|drop|3_2023-10-25T13-35-27.526768.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T13-35-27.526768.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T13_35_27.526768", "path": ["**/details_harness|gsm8k|5_2023-10-25T13-35-27.526768.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T13-35-27.526768.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hellaswag|10_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-23-50.443527.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-23-50.443527.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T22-23-50.443527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T22-23-50.443527.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T22-23-50.443527.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T13_35_27.526768", "path": ["**/details_harness|winogrande|5_2023-10-25T13-35-27.526768.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T13-35-27.526768.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_21T22_23_50.443527", "path": ["results_2023-09-21T22-23-50.443527.parquet"]}, {"split": "2023_10_25T13_35_27.526768", "path": ["results_2023-10-25T13-35-27.526768.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T13-35-27.526768.parquet"]}]}]}
2023-10-25T12:35:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Secbone/llama-33B-instructed ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Secbone/llama-33B-instructed on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-25T13:35:27.526768 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
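The summary above refers to a loading example that was stripped when the card was flattened; a hedged reconstruction, assuming the repository follows the leaderboard's `details_<org>__<model>` naming convention, could look like this:
```python
# Hedged reconstruction of the stripped loading snippet. The repository id
# is assumed from the leaderboard naming convention; the config name and the
# "latest" split are taken from the metadata earlier in this record.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Secbone__llama-33B-instructed",  # assumed id
    "harness_winogrande_5",
    split="latest",  # alias for the most recent run
)
```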
[ "# Dataset Card for Evaluation run of Secbone/llama-33B-instructed", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Secbone/llama-33B-instructed on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T13:35:27.526768(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Secbone/llama-33B-instructed", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Secbone/llama-33B-instructed on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T13:35:27.526768(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Secbone/llama-33B-instructed## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Secbone/llama-33B-instructed on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T13:35:27.526768(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8e8bc3e8e5ff49668c6380c8e89848401d915712
# Dataset Card for Evaluation run of codeparrot/codeparrot

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/codeparrot/codeparrot
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [codeparrot/codeparrot](https://huggingface.co/codeparrot/codeparrot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codeparrot__codeparrot",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T22:03:29.134706](https://huggingface.co/datasets/open-llm-leaderboard/details_codeparrot__codeparrot/blob/main/results_2023-10-27T22-03-29.134706.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.00027736144573357115,
        "f1": 0.020442533557047064,
        "f1_stderr": 0.0007057901378550561,
        "acc": 0.252123807648879,
        "acc_stderr": 0.007682267037046532
    },
    "harness|drop|3": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.00027736144573357115,
        "f1": 0.020442533557047064,
        "f1_stderr": 0.0007057901378550561
    },
    "harness|gsm8k|5": {
        "acc": 0.002274450341167551,
        "acc_stderr": 0.0013121578148674346
    },
    "harness|winogrande|5": {
        "acc": 0.5019731649565904,
        "acc_stderr": 0.014052376259225629
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
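Beyond a single sub-task, the card notes that the "results" configuration stores the aggregated metrics. A minimal sketch of loading it, using the "latest" split declared in the metadata below, is:
```python
# Minimal sketch: load the aggregated "results" configuration for this model.
# Config name and "latest" split come from the repository metadata; the rows
# are expected to carry the aggregated em/f1/acc figures quoted above.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_codeparrot__codeparrot",
    "results",
    split="latest",
)
print(results[0])
```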
open-llm-leaderboard/details_codeparrot__codeparrot
[ "region:us" ]
2023-09-21T21:35:33+00:00
{"pretty_name": "Evaluation run of codeparrot/codeparrot", "dataset_summary": "Dataset automatically created during the evaluation run of model [codeparrot/codeparrot](https://huggingface.co/codeparrot/codeparrot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codeparrot__codeparrot\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T22:03:29.134706](https://huggingface.co/datasets/open-llm-leaderboard/details_codeparrot__codeparrot/blob/main/results_2023-10-27T22-03-29.134706.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573357115,\n \"f1\": 0.020442533557047064,\n \"f1_stderr\": 0.0007057901378550561,\n \"acc\": 0.252123807648879,\n \"acc_stderr\": 0.007682267037046532\n },\n \"harness|drop|3\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573357115,\n \"f1\": 0.020442533557047064,\n \"f1_stderr\": 0.0007057901378550561\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674346\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5019731649565904,\n \"acc_stderr\": 0.014052376259225629\n }\n}\n```", "repo_url": "https://huggingface.co/codeparrot/codeparrot", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|arc:challenge|25_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T22_03_29.134706", "path": ["**/details_harness|drop|3_2023-10-27T22-03-29.134706.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T22-03-29.134706.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T22_03_29.134706", "path": ["**/details_harness|gsm8k|5_2023-10-27T22-03-29.134706.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T22-03-29.134706.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hellaswag|10_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T22-35-18.428619.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T22-35-18.428619.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T22_03_29.134706", "path": ["**/details_harness|winogrande|5_2023-10-27T22-03-29.134706.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T22-03-29.134706.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_21T22_35_18.428619", "path": ["results_2023-09-21T22-35-18.428619.parquet"]}, {"split": "2023_10_27T22_03_29.134706", "path": ["results_2023-10-27T22-03-29.134706.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T22-03-29.134706.parquet"]}]}]}
2023-10-27T21:03:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of codeparrot/codeparrot ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model codeparrot/codeparrot on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-27T22:03:29.134706 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
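The loading snippet referenced in the summary above was stripped when this text was flattened. A minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repo naming and the `harness_winogrande_5` config declared in this record's metadata:

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the leaderboard's details_<org>__<model>
# naming pattern; the config name comes from this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_codeparrot__codeparrot",
    "harness_winogrande_5",
    split="train",
)
```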
[ "# Dataset Card for Evaluation run of codeparrot/codeparrot", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model codeparrot/codeparrot on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T22:03:29.134706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of codeparrot/codeparrot", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model codeparrot/codeparrot on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T22:03:29.134706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 16, 31, 164, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of codeparrot/codeparrot## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model codeparrot/codeparrot on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T22:03:29.134706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6c755644dc0241ab0162b4e937f1a04bc4e4408c
# Dataset Card for "Claim_Validation" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Brecon/Claim_Validation
[ "region:us" ]
2023-09-21T21:41:07+00:00
{"dataset_info": {"features": [{"name": "label", "dtype": "int64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 167303, "num_examples": 153}], "download_size": 88825, "dataset_size": 167303}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-21T21:41:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Claim_Validation" More Information needed
[ "# Dataset Card for \"Claim_Validation\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Claim_Validation\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Claim_Validation\"\n\nMore Information needed" ]
b2e3d3d0c8a71de6efa3c610676fac065535a8b8
# AutoTrain Dataset for project: skinnnnnnn

## Dataset Description

This dataset has been automatically processed by AutoTrain for project skinnnnnnn.

### Languages

The BCP-47 code for the dataset's language is unk.

## Dataset Structure

### Data Instances

A sample from this dataset looks as follows:

```json
[
  {
    "image": "<224x224 RGB PIL image>",
    "target": 0
  },
  {
    "image": "<224x224 RGB PIL image>",
    "target": 1
  }
]
```

### Dataset Fields

The dataset has the following fields (also called "features"):

```json
{
  "image": "Image(decode=True, id=None)",
  "target": "ClassLabel(names=['benign', 'malignant'], id=None)"
}
```

### Dataset Splits

This dataset is split into a train and validation split. The split sizes are as follows:

| Split name | Num samples |
| ---------- | ----------- |
| train      | 397         |
| valid      | 101         |
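Since the card describes an `image` feature and a binary `target` ClassLabel, a minimal loading sketch (assuming public access to the repo id above and the standard `datasets` API):

```python
from datasets import load_dataset

# Load the AutoTrain-processed splits; per the card, "train" has 397
# samples and "valid" has 101.
ds = load_dataset("dharun2049/autotrain-data-skinnnnnnn")

sample = ds["train"][0]
print(sample["image"].size)  # a 224x224 PIL image, per the card
print(sample["target"])      # 0 = benign, 1 = malignant
```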
dharun2049/autotrain-data-skinnnnnnn
[ "task_categories:image-classification", "region:us" ]
2023-09-21T21:49:57+00:00
{"task_categories": ["image-classification"]}
2023-09-21T21:51:24+00:00
[]
[]
TAGS #task_categories-image-classification #region-us
AutoTrain Dataset for project: skinnnnnnn ========================================= Dataset Description ------------------- This dataset has been automatically processed by AutoTrain for project skinnnnnnn. ### Languages The BCP-47 code for the dataset's language is unk. Dataset Structure ----------------- ### Data Instances A sample from this dataset looks as follows: ### Dataset Fields The dataset has the following fields (also called "features"): ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follows: train has 397 samples and valid has 101.
[ "### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA sample from this dataset looks as follows:", "### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
[ "TAGS\n#task_categories-image-classification #region-us \n", "### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA sample from this dataset looks as follows:", "### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
[ 17, 27, 17, 23, 27 ]
[ "passage: TAGS\n#task_categories-image-classification #region-us \n### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nA sample from this dataset looks as follows:### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
f275d67428929486156f53ac19cb2c706c8658e7
# Dataset Card for "Train_Test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Brecon/Train_Test
[ "region:us" ]
2023-09-21T21:50:18+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "label", "dtype": "int64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 195875.8617511521, "num_examples": 173}, {"name": "test", "num_bytes": 49818.13824884793, "num_examples": 44}], "download_size": 143188, "dataset_size": 245694.0}}
2023-10-10T22:25:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Train_Test" More Information needed
[ "# Dataset Card for \"Train_Test\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Train_Test\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Train_Test\"\n\nMore Information needed" ]
49f5ed3691b0fe039d4641990a598a793dc56149
# AutoTrain Dataset for project: bingbongdomh

## Dataset Description

This dataset has been automatically processed by AutoTrain for project bingbongdomh.

### Languages

The BCP-47 code for the dataset's language is unk.

## Dataset Structure

### Data Instances

A sample from this dataset looks as follows:

```json
[
  {
    "image": "<224x224 RGB PIL image>",
    "target": 0
  },
  {
    "image": "<224x224 RGB PIL image>",
    "target": 1
  }
]
```

### Dataset Fields

The dataset has the following fields (also called "features"):

```json
{
  "image": "Image(decode=True, id=None)",
  "target": "ClassLabel(names=['benign', 'malignant'], id=None)"
}
```

### Dataset Splits

This dataset is split into a train and validation split. The split sizes are as follows:

| Split name | Num samples |
| ---------- | ----------- |
| train      | 397         |
| valid      | 101         |
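Because `target` is a ClassLabel, integer labels can be mapped back to names. A short sketch under the same assumptions as above (public repo, standard `datasets` API):

```python
from datasets import load_dataset

ds = load_dataset("dharun2049/autotrain-data-bingbongdomh", split="valid")

# The target feature is a ClassLabel; int2str maps 0/1 back to the
# names declared in the card ('benign', 'malignant').
target_feature = ds.features["target"]
print(target_feature.int2str(ds[0]["target"]))
```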
dharun2049/autotrain-data-bingbongdomh
[ "task_categories:image-classification", "region:us" ]
2023-09-21T21:54:13+00:00
{"task_categories": ["image-classification"]}
2023-09-21T22:01:44+00:00
[]
[]
TAGS #task_categories-image-classification #region-us
AutoTrain Dataset for project: bingbongdomh =========================================== Dataset Description ------------------- This dataset has been automatically processed by AutoTrain for project bingbongdomh. ### Languages The BCP-47 code for the dataset's language is unk. Dataset Structure ----------------- ### Data Instances A sample from this dataset looks as follows: ### Dataset Fields The dataset has the following fields (also called "features"): ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follows: train has 397 samples and valid has 101.
[ "### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA sample from this dataset looks as follows:", "### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
[ "TAGS\n#task_categories-image-classification #region-us \n", "### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA sample from this dataset looks as follows:", "### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
[ 17, 27, 17, 23, 27 ]
[ "passage: TAGS\n#task_categories-image-classification #region-us \n### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nA sample from this dataset looks as follows:### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
c7e6544bdb2c77589cc777a5998e63de0601fa69
# Dataset Card for Evaluation run of speechlessai/speechless-codellama-airoboros-orca-platypus-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/speechlessai/speechless-codellama-airoboros-orca-platypus-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [speechlessai/speechless-codellama-airoboros-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-codellama-airoboros-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-24T13:10:29.056795](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b/blob/main/results_2023-10-24T13-10-29.056795.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2918414429530201,
        "em_stderr": 0.004655629237839093,
        "f1": 0.3326352768456382,
        "f1_stderr": 0.004616275233277117,
        "acc": 0.3398002480892164,
        "acc_stderr": 0.008490890879958144
    },
    "harness|drop|3": {
        "em": 0.2918414429530201,
        "em_stderr": 0.004655629237839093,
        "f1": 0.3326352768456382,
        "f1_stderr": 0.004616275233277117
    },
    "harness|gsm8k|5": {
        "acc": 0.01819560272934041,
        "acc_stderr": 0.003681611894073872
    },
    "harness|winogrande|5": {
        "acc": 0.6614048934490924,
        "acc_stderr": 0.013300169865842416
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
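Beyond the per-config details, the aggregated metrics shown above live in the "results" config, whose "latest" split is declared in this record's metadata. A minimal sketch, assuming the standard `datasets` API:

```python
from datasets import load_dataset

# The "results" config and its "latest" split are declared in the repo's
# metadata; the rows hold the aggregated metrics (em, f1, acc, ...).
results = load_dataset(
    "open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b",
    "results",
    split="latest",
)
print(results[0])
```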
open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b
[ "region:us" ]
2023-09-21T21:55:38+00:00
{"pretty_name": "Evaluation run of speechlessai/speechless-codellama-airoboros-orca-platypus-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [speechlessai/speechless-codellama-airoboros-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-codellama-airoboros-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T13:10:29.056795](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b/blob/main/results_2023-10-24T13-10-29.056795.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2918414429530201,\n \"em_stderr\": 0.004655629237839093,\n \"f1\": 0.3326352768456382,\n \"f1_stderr\": 0.004616275233277117,\n \"acc\": 0.3398002480892164,\n \"acc_stderr\": 0.008490890879958144\n },\n \"harness|drop|3\": {\n \"em\": 0.2918414429530201,\n \"em_stderr\": 0.004655629237839093,\n \"f1\": 0.3326352768456382,\n \"f1_stderr\": 0.004616275233277117\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \"acc_stderr\": 0.003681611894073872\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6614048934490924,\n \"acc_stderr\": 0.013300169865842416\n }\n}\n```", "repo_url": "https://huggingface.co/speechlessai/speechless-codellama-airoboros-orca-platypus-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|arc:challenge|25_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T13_10_29.056795", "path": ["**/details_harness|drop|3_2023-10-24T13-10-29.056795.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T13-10-29.056795.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T13_10_29.056795", "path": ["**/details_harness|gsm8k|5_2023-10-24T13-10-29.056795.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T13-10-29.056795.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": 
["**/details_harness|hellaswag|10_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-55-13.794289.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-55-13.794289.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-55-13.794289.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-09-21T22-55-13.794289.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-55-13.794289.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T22-55-13.794289.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T22-55-13.794289.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T13_10_29.056795", "path": ["**/details_harness|winogrande|5_2023-10-24T13-10-29.056795.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T13-10-29.056795.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_21T22_55_13.794289", "path": ["results_2023-09-21T22-55-13.794289.parquet"]}, {"split": "2023_10_24T13_10_29.056795", "path": ["results_2023-10-24T13-10-29.056795.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T13-10-29.056795.parquet"]}]}]}
2023-10-24T12:10:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of speechlessai/speechless-codellama-airoboros-orca-platypus-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model speechlessai/speechless-codellama-airoboros-orca-platypus-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-24T13:10:29.056795 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
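The snippet referenced above ("you can for instance do the following") does not appear in this flattened copy of the card; a minimal sketch, assuming the repository follows the leaderboard's `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Load the per-example details of the latest winogrande run for this model.
# The repository name is an assumption derived from the leaderboard's
# details_<org>__<model> pattern; the config name matches the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b",
    "harness_winogrande_5",
    split="train",
)
```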
[ "# Dataset Card for Evaluation run of speechlessai/speechless-codellama-airoboros-orca-platypus-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model speechlessai/speechless-codellama-airoboros-orca-platypus-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-24T13:10:29.056795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of speechlessai/speechless-codellama-airoboros-orca-platypus-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model speechlessai/speechless-codellama-airoboros-orca-platypus-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-24T13:10:29.056795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 33, 31, 181, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of speechlessai/speechless-codellama-airoboros-orca-platypus-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model speechlessai/speechless-codellama-airoboros-orca-platypus-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T13:10:29.056795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
361e77f387cb1cdbbc9df054d6123780dc940b26
# AutoTrain Dataset for project: vit-skin-derna

## Dataset Description

This dataset has been automatically processed by AutoTrain for project vit-skin-derna.

### Languages

The BCP-47 code for the dataset's language is unk.

## Dataset Structure

### Data Instances

A sample from this dataset looks as follows:

```json
[
  {
    "image": "<32x32 RGB PIL image>",
    "target": 4
  },
  {
    "image": "<32x32 RGB PIL image>",
    "target": 8
  }
]
```

### Dataset Fields

The dataset has the following fields (also called "features"):

```json
{
  "image": "Image(decode=True, id=None)",
  "target": "ClassLabel(names=['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck'], id=None)"
}
```

### Dataset Splits

This dataset is split into a train and a validation split. The split sizes are as follows:

| Split name | Num samples |
| ------------ | ------------------- |
| train | 40000 |
| valid | 10000 |
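Since the card documents the fields and splits but not how to read them, a minimal loading sketch (the repository id is taken from the entry below; the `train`/`valid` split names are assumed to match the table above):

```python
from datasets import load_dataset

# Load both declared splits of the AutoTrain-processed dataset.
dataset = load_dataset("dharun2049/autotrain-data-vit-skin-derna")

sample = dataset["train"][0]
print(sample["image"].size)                           # decoded PIL image, (32, 32)
print(sample["target"])                               # integer class label
print(dataset["train"].features["target"].names)      # the ten class names
print(len(dataset["train"]), len(dataset["valid"]))   # 40000, 10000
```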
dharun2049/autotrain-data-vit-skin-derna
[ "task_categories:image-classification", "region:us" ]
2023-09-21T22:04:34+00:00
{"task_categories": ["image-classification"]}
2023-09-21T22:19:24+00:00
[]
[]
TAGS #task_categories-image-classification #region-us
AutoTrain Dataset for project: vit-skin-derna ============================================= Dataset Description ------------------- This dataset has been automatically processed by AutoTrain for project vit-skin-derna. ### Languages The BCP-47 code for the dataset's language is unk. Dataset Structure ----------------- ### Data Instances A sample from this dataset looks as follows: ### Dataset Fields The dataset has the following fields (also called "features"): ### Dataset Splits This dataset is split into a train and a validation split. The split sizes are as follows:
[ "### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA sample from this dataset looks as follows:", "### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
[ "TAGS\n#task_categories-image-classification #region-us \n", "### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA sample from this dataset looks as follows:", "### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
[ 17, 27, 17, 23, 27 ]
[ "passage: TAGS\n#task_categories-image-classification #region-us \n### Languages\n\n\nThe BCP-47 code for the dataset's language is unk.\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nA sample from this dataset looks as follows:### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:" ]
cdc27125a77ef23a5d7055dbb0ddb87046c58e02
# Dataset Card for "forum_uristov_rf_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
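A minimal sketch for loading this dataset and inspecting a record, assuming the `prompt`/`solution`/`link` string features declared in the metadata below:

```python
from datasets import load_dataset

# Single "train" split of 1849 prompt/solution/link records.
rows = load_dataset("dim/forum_uristov_rf_prompts", split="train")

example = rows[0]
print(example["prompt"][:200])  # the forum question text
print(example["link"])          # source URL of the thread
```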
dim/forum_uristov_rf_prompts
[ "region:us" ]
2023-09-21T22:06:19+00:00
{"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "solution", "dtype": "string"}, {"name": "link", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3043144, "num_examples": 1849}], "download_size": 1343977, "dataset_size": 3043144}}
2023-09-21T22:06:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for "forum_uristov_rf_prompts" More Information needed
[ "# Dataset Card for \"forum_uristov_rf_prompts\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"forum_uristov_rf_prompts\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"forum_uristov_rf_prompts\"\n\nMore Information needed" ]
e7a6743b92f5312cf36f55a465418825230a1c05
Another swing at pet insurance filing data.
jwixel/pet-insurance-data-2
[ "region:us" ]
2023-09-21T22:11:07+00:00
{}
2023-09-24T16:34:59+00:00
[]
[]
TAGS #region-us
Another swing at pet insurance filing data.
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
4fa87247594acd1b97616a1c23e1901367c95bae
# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [wei123602/Llama-2-13b-FINETUNE4_TEST](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-25T04:35:36.269188](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST/blob/main/results_2023-10-25T04-35-36.269188.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2558724832214765,
        "em_stderr": 0.004468637497676013,
        "f1": 0.29727348993288566,
        "f1_stderr": 0.0043971826108447475,
        "acc": 0.4511208594202994,
        "acc_stderr": 0.010571455427847876
    },
    "harness|drop|3": {
        "em": 0.2558724832214765,
        "em_stderr": 0.004468637497676013,
        "f1": 0.29727348993288566,
        "f1_stderr": 0.0043971826108447475
    },
    "harness|gsm8k|5": {
        "acc": 0.13191811978771797,
        "acc_stderr": 0.009321265253857515
    },
    "harness|winogrande|5": {
        "acc": 0.7703235990528808,
        "acc_stderr": 0.011821645601838234
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
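Beyond the per-task details, the aggregated metrics of each run can be read from the `results` configuration declared in the metadata below; a minimal sketch:

```python
from datasets import load_dataset

# The "results" config collects the aggregated metrics of every run;
# the "latest" split points at the newest one, per the declared data_files.
results = load_dataset(
    "open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST",
    "results",
    split="latest",
)
print(results[0])
```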
open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST
[ "region:us" ]
2023-09-21T22:18:19+00:00
{"pretty_name": "Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST", "dataset_summary": "Dataset automatically created during the evaluation run of model [wei123602/Llama-2-13b-FINETUNE4_TEST](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T04:35:36.269188](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST/blob/main/results_2023-10-25T04-35-36.269188.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2558724832214765,\n \"em_stderr\": 0.004468637497676013,\n \"f1\": 0.29727348993288566,\n \"f1_stderr\": 0.0043971826108447475,\n \"acc\": 0.4511208594202994,\n \"acc_stderr\": 0.010571455427847876\n },\n \"harness|drop|3\": {\n \"em\": 0.2558724832214765,\n \"em_stderr\": 0.004468637497676013,\n \"f1\": 0.29727348993288566,\n \"f1_stderr\": 0.0043971826108447475\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13191811978771797,\n \"acc_stderr\": 0.009321265253857515\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838234\n }\n}\n```", "repo_url": "https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|arc:challenge|25_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T04_35_36.269188", "path": ["**/details_harness|drop|3_2023-10-25T04-35-36.269188.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T04-35-36.269188.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T04_35_36.269188", "path": ["**/details_harness|gsm8k|5_2023-10-25T04-35-36.269188.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T04-35-36.269188.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hellaswag|10_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-21T23-17-56.003321.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T23-17-56.003321.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T23-17-56.003321.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T04_35_36.269188", "path": ["**/details_harness|winogrande|5_2023-10-25T04-35-36.269188.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T04-35-36.269188.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_21T23_17_56.003321", "path": ["results_2023-09-21T23-17-56.003321.parquet"]}, {"split": "2023_10_25T04_35_36.269188", "path": ["results_2023-10-25T04-35-36.269188.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T04-35-36.269188.parquet"]}]}]}
2023-10-25T03:35:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model wei123602/Llama-2-13b-FINETUNE4_TEST on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-25T04:35:36.269188 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model wei123602/Llama-2-13b-FINETUNE4_TEST on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T04:35:36.269188(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model wei123602/Llama-2-13b-FINETUNE4_TEST on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T04:35:36.269188(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model wei123602/Llama-2-13b-FINETUNE4_TEST on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T04:35:36.269188(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
dfeb52ed6ee9e64c5075791a657f8adb104c12fa
# Bangumi Image Base of Don't Toy With Me, Miss Nagatoro

This is the image base of bangumi Don't Toy With Me, Miss Nagatoro. We detected 19 characters and 3059 images in total. The full dataset is [here](all.zip).

**Please note that these image bases are not guaranteed to be 100% cleaned; they may still be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability); a minimal download-and-unpack sketch follows the preview table.

Here is the characters' preview:

| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 43 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) |
| 1 | 34 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) |
| 2 | 1240 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) |
| 3 | 28 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) |
| 4 | 12 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) |
| 5 | 42 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) |
| 6 | 28 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) |
| 7 | 16 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) |
| 8 | 1114 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) |
| 9 | 9 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) |
| 10 | 15 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) |
| 11 | 144 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) |
| 12 | 11 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) |
| 13 | 87 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) |
| 14 | 9 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) |
| 15 | 11 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) |
| 16 | 83 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) |
| 17 | 12 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) |
| noise | 121 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
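As a minimal sketch of the download-and-preprocess workflow recommended above, assuming only `huggingface_hub` and the repo id recorded for this card, the full pack can be fetched and unpacked like this; dropping the noise cluster is the simplest cleanup step:

```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# Fetch the full pack ("all.zip") from this card's dataset repo.
zip_path = hf_hub_download(
    repo_id="BangumiBase/donttoywithmemissnagatoro",
    filename="all.zip",
    repo_type="dataset",
)

out_dir = Path("nagatoro_images")
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(out_dir)

# Folder layout is an assumption based on the per-character links above:
# numbered character folders plus "-1" for the noise cluster, which you
# may want to drop before training.
print(sorted(p.name for p in out_dir.iterdir()))
```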
BangumiBase/donttoywithmemissnagatoro
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
2023-09-21T22:47:50+00:00
{"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]}
2023-09-29T09:12:18+00:00
[]
[]
TAGS #size_categories-1K<n<10K #license-mit #art #region-us
Bangumi Image Base of Don't Toy With Me, Miss Nagatoro
======================================================

This is the image base of bangumi Don't Toy With Me, Miss Nagatoro. We detected 19 characters and 3059 images in total. The full dataset is here.

Please note that these image bases are not guaranteed to be 100% cleaned; they may still be noisy. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).

Here is the characters' preview:
[]
[ "TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
[ 25 ]
[ "passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
8099cf6c81fa9c4ed420fbab3da9780357da62b0
# Dataset Card for "three_styles_prompted_250_512x512" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
kewu93/three_styles_prompted_250_512x512
[ "region:us" ]
2023-09-21T22:51:24+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "style_class", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17235209.8, "num_examples": 600}, {"name": "val", "num_bytes": 4420404.2, "num_examples": 150}], "download_size": 21435960, "dataset_size": 21655614.0}}
2023-09-21T22:53:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "three_styles_prompted_250_512x512" More Information needed
[ "# Dataset Card for \"three_styles_prompted_250_512x512\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"three_styles_prompted_250_512x512\"\n\nMore Information needed" ]
[ 6, 28 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"three_styles_prompted_250_512x512\"\n\nMore Information needed" ]
725c0c9c08134d3602327f740da954f17d426e15
# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [zarakiquemparte/kuchiki-1.1-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-25T05:14:37.796518](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b/blob/main/results_2023-10-25T05-14-37.796518.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.25786493288590606,
        "em_stderr": 0.004479992336423503,
        "f1": 0.33612416107382703,
        "f1_stderr": 0.004456179772038806,
        "acc": 0.3893274364772528,
        "acc_stderr": 0.009141619357749198
    },
    "harness|drop|3": {
        "em": 0.25786493288590606,
        "em_stderr": 0.004479992336423503,
        "f1": 0.33612416107382703,
        "f1_stderr": 0.004456179772038806
    },
    "harness|gsm8k|5": {
        "acc": 0.04700530705079606,
        "acc_stderr": 0.0058298983559372
    },
    "harness|winogrande|5": {
        "acc": 0.7316495659037096,
        "acc_stderr": 0.012453340359561195
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
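Beyond the per-task details, the aggregated scores can be pulled from the `results` configuration; per the split list in this card's config metadata, `latest` always points at the newest run. A minimal sketch:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics for each run; the "latest" split
# is an alias for the most recent one (see the configs metadata below).
results = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b",
    "results",
    split="latest",
)
print(results[0])
```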
open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b
[ "region:us" ]
2023-09-21T23:10:02+00:00
{"pretty_name": "Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [zarakiquemparte/kuchiki-1.1-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T05:14:37.796518](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b/blob/main/results_2023-10-25T05-14-37.796518.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.25786493288590606,\n \"em_stderr\": 0.004479992336423503,\n \"f1\": 0.33612416107382703,\n \"f1_stderr\": 0.004456179772038806,\n \"acc\": 0.3893274364772528,\n \"acc_stderr\": 0.009141619357749198\n },\n \"harness|drop|3\": {\n \"em\": 0.25786493288590606,\n \"em_stderr\": 0.004479992336423503,\n \"f1\": 0.33612416107382703,\n \"f1_stderr\": 0.004456179772038806\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04700530705079606,\n \"acc_stderr\": 0.0058298983559372\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n }\n}\n```", "repo_url": "https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|arc:challenge|25_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T05_14_37.796518", "path": ["**/details_harness|drop|3_2023-10-25T05-14-37.796518.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T05-14-37.796518.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T05_14_37.796518", "path": ["**/details_harness|gsm8k|5_2023-10-25T05-14-37.796518.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T05-14-37.796518.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hellaswag|10_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T00-09-37.890921.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T00-09-37.890921.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T05_14_37.796518", "path": ["**/details_harness|winogrande|5_2023-10-25T05-14-37.796518.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T05-14-37.796518.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T00_09_37.890921", "path": ["results_2023-09-22T00-09-37.890921.parquet"]}, {"split": "2023_10_25T05_14_37.796518", "path": ["results_2023-10-25T05-14-37.796518.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T05-14-37.796518.parquet"]}]}]}
2023-10-25T04:14:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model zarakiquemparte/kuchiki-1.1-l2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-25T05:14:37.796518 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model zarakiquemparte/kuchiki-1.1-l2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T05:14:37.796518(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model zarakiquemparte/kuchiki-1.1-l2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T05:14:37.796518(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model zarakiquemparte/kuchiki-1.1-l2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T05:14:37.796518(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
446790ab9f1d3ccf6fd645b08331cf7caef8eeb7
# <a href="https://arxiv.org/abs/2309.09800">AMuRD</a>: Annotated Multilingual Receipts Dataset for Cross-lingual Key Information Extraction and Classification by Abdelrahman Abdallah, Mahmoud Abdalla, Mohamed Elkasaby, Yasser Elbendary, Adam Jatowt ![](images/model.png) ## Abstract > Key information extraction involves recognizing and extracting text from scanned receipts, enabling retrieval of essential content, and organizing it into structured documents. This paper presents a novel multilingual dataset for receipt extraction, addressing key challenges in information extraction and item classification. The dataset comprises $47,720$ samples, including annotations for item names, attributes like (price, brand, etc.), and classification into $44$ product categories. We introduce the InstructLLaMA approach, achieving an F1 score of $0.76$ and an accuracy of $0.68$ for key information extraction and item classification. ## Demo for our Instruct LLama Explore our Instruct LLama system through our live demo: [**Demo for our Instruct LLama**](http://18.188.209.98:5052/) ## Examples | Example | Input | Class | Brand | Weight | Number of units | Size of units | Price | T.Price | Pack | Unit | | ------- | ------------------------------------ | ---------------------- | -------------| --------- | ---------------- | --------------- | ------- | ------- | ------ | ----- | | Example 1| `40.99 20.99 2 chunks sunshine` | Tins, Jars & Packets | sunshine | No Weight | 2 | No Size of units| 20.99 | 40.99 | علبة | No Unit | | Example 2| `برسيل اتوماتيك جل روز 2.6` | Cleaning Supplies | برسيل | 2.6ل | 1 | No Size of units| No Price| No T.Price | عبوة | ل | | Example 3| `regina Pasta penne 400g` | Rice, Pasta & Pulses | regina | 400g | 1 | No Size of units| No Price| No T.Price | كيس | g | | Example 4| `10.00 400g Penne Pasta ElMaleka` | Rice, Pasta & Pulses | ElMaleka | 400g | 1 | No Size of units| 10 | 10 | كيس | g | ## Getting the code To get started with the code and utilize the AMuRD dataset for your research or projects, you can clone this repository: ```bash git clone https://github.com/yourusername/AMuRD.git ``` ## Dependencies ## Reproducing the results ## Citation Please consider to cite our paper: ``` @misc{abdallah2023amurd, title={AMuRD: Annotated Multilingual Receipts Dataset for Cross-lingual Key Information Extraction and Classification}, author={Abdelrahman Abdallah and Mahmoud Abdalla and Mohamed Elkasaby and Yasser Elbendary and Adam Jatowt}, year={2023}, eprint={2309.09800}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ## License Note: The AMuRD Dataset can only be used for non-commercial research purposes. For researchers who want to use the AMuRD database, please first fill in this [Application Form](Application_Form/Application_Form_for_AMuRD.doc) and send it via email to us ([[email protected]](mailto:[email protected]), [[email protected]](mailto:[email protected]), [[email protected]](mailto:[email protected])).
abdoelsayed/AMuRD
[ "arxiv:2309.09800", "region:us" ]
2023-09-21T23:18:52+00:00
{}
2023-09-21T23:19:55+00:00
[ "2309.09800" ]
[]
TAGS #arxiv-2309.09800 #region-us
<a href="URL Annotated Multilingual Receipts Dataset for Cross-lingual Key Information Extraction and Classification ==================================================================================================================== by Abdelrahman Abdallah, Mahmoud Abdalla, Mohamed Elkasaby, Yasser Elbendary, Adam Jatowt ![](images/URL) Abstract -------- > > Key information extraction involves recognizing and extracting text from scanned receipts, > enabling retrieval of essential content, and organizing it into structured documents. > This paper presents a novel multilingual dataset for receipt extraction, addressing key challenges in information extraction and item classification. > The dataset comprises $47,720$ samples, including annotations for item names, attributes like (price, brand, etc.), and classification into $44$ product categories. > We introduce the InstructLLaMA approach, achieving an F1 score of $0.76$ and an accuracy of $0.68$ for key information extraction and item classification. > > > Demo for our Instruct LLama --------------------------- Explore our Instruct LLama system through our live demo: Demo for our Instruct LLama Examples -------- Getting the code ---------------- To get started with the code and utilize the AMuRD dataset for your research or projects, you can clone this repository: Dependencies ------------ Reproducing the results ----------------------- Please consider to cite our paper: License ------- Note: The AMuRD Dataset can only be used for non-commercial research purposes. For researchers who want to use the AMuRD database, please first fill in this Application Form and send it via email to us (m.abdallah@URL, Yelbendary@URL, abdoelsayed2016@URL).
[]
[ "TAGS\n#arxiv-2309.09800 #region-us \n" ]
[ 14 ]
[ "passage: TAGS\n#arxiv-2309.09800 #region-us \n" ]
04ab46595baa8e54e1d1305528c9481e558d3150
# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-l2-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/zarakiquemparte/kuchiki-l2-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [zarakiquemparte/kuchiki-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T01:56:08.960825](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b/blob/main/results_2023-10-27T01-56-08.960825.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.27611157718120805,
        "em_stderr": 0.004578442614328635,
        "f1": 0.35264576342282045,
        "f1_stderr": 0.004531331117609875,
        "acc": 0.38779557831535094,
        "acc_stderr": 0.009079399041337897
    },
    "harness|drop|3": {
        "em": 0.27611157718120805,
        "em_stderr": 0.004578442614328635,
        "f1": 0.35264576342282045,
        "f1_stderr": 0.004531331117609875
    },
    "harness|gsm8k|5": {
        "acc": 0.04473085670962851,
        "acc_stderr": 0.005693886131407058
    },
    "harness|winogrande|5": {
        "acc": 0.7308602999210734,
        "acc_stderr": 0.012464911951268734
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
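The card above notes that the aggregated metrics live in an extra "results" configuration. As a minimal sketch, assuming that config and its "latest" split behave like the per-task configs in the snippet above, they could be loaded like this:

```python
from datasets import load_dataset

# Sketch only: assumes the "results" config described above behaves like
# the per-task configs, with a "latest" split holding the newest
# aggregated metrics row.
results = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b",
    "results",
    split="latest",
)
print(results[0])  # aggregated em/f1/acc values for the most recent run
```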
open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b
[ "region:us" ]
2023-09-21T23:21:37+00:00
{"pretty_name": "Evaluation run of zarakiquemparte/kuchiki-l2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [zarakiquemparte/kuchiki-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T01:56:08.960825](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b/blob/main/results_2023-10-27T01-56-08.960825.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.27611157718120805,\n \"em_stderr\": 0.004578442614328635,\n \"f1\": 0.35264576342282045,\n \"f1_stderr\": 0.004531331117609875,\n \"acc\": 0.38779557831535094,\n \"acc_stderr\": 0.009079399041337897\n },\n \"harness|drop|3\": {\n \"em\": 0.27611157718120805,\n \"em_stderr\": 0.004578442614328635,\n \"f1\": 0.35264576342282045,\n \"f1_stderr\": 0.004531331117609875\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04473085670962851,\n \"acc_stderr\": 0.005693886131407058\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268734\n }\n}\n```", "repo_url": "https://huggingface.co/zarakiquemparte/kuchiki-l2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|arc:challenge|25_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T01_56_08.960825", "path": ["**/details_harness|drop|3_2023-10-27T01-56-08.960825.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T01-56-08.960825.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T01_56_08.960825", "path": ["**/details_harness|gsm8k|5_2023-10-27T01-56-08.960825.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T01-56-08.960825.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hellaswag|10_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-21-14.015290.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-21-14.015290.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T00-21-14.015290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T00-21-14.015290.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T00-21-14.015290.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T01_56_08.960825", "path": ["**/details_harness|winogrande|5_2023-10-27T01-56-08.960825.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T01-56-08.960825.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T00_21_14.015290", "path": ["results_2023-09-22T00-21-14.015290.parquet"]}, {"split": "2023_10_27T01_56_08.960825", "path": ["results_2023-10-27T01-56-08.960825.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T01-56-08.960825.parquet"]}]}]}
2023-10-27T00:56:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-l2-7b

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model zarakiquemparte/kuchiki-l2-7b on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-27T01:56:08.960825 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
[ "# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-l2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model zarakiquemparte/kuchiki-l2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T01:56:08.960825(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-l2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model zarakiquemparte/kuchiki-l2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T01:56:08.960825(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-l2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model zarakiquemparte/kuchiki-l2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T01:56:08.960825(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8de5c41ff63e30cc057c662ec92f5f3a35c8b71b
# Dataset Card for Evaluation run of zarakiquemparte/zarafusionex-1.2-l2-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/zarakiquemparte/zarafusionex-1.2-l2-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [zarakiquemparte/zarafusionex-1.2-l2-7b](https://huggingface.co/zarakiquemparte/zarafusionex-1.2-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T14:21:00.502633](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b/blob/main/results_2023-10-23T14-21-00.502633.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.13590604026845637,
        "em_stderr": 0.0035094536867653074,
        "f1": 0.22546770134228172,
        "f1_stderr": 0.003699492065854956,
        "acc": 0.4165529242035385,
        "acc_stderr": 0.009960183652638432
    },
    "harness|drop|3": {
        "em": 0.13590604026845637,
        "em_stderr": 0.0035094536867653074,
        "f1": 0.22546770134228172,
        "f1_stderr": 0.003699492065854956
    },
    "harness|gsm8k|5": {
        "acc": 0.08567096285064443,
        "acc_stderr": 0.007709218855882758
    },
    "harness|winogrande|5": {
        "acc": 0.7474348855564326,
        "acc_stderr": 0.012211148449394107
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
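Since the "Latest results" block above is a plain Python dict, a small illustrative snippet can turn it into a quick per-task metric listing; the values are copied from the card, and the flattening itself is just an example, not part of the evaluation harness:

```python
# Aggregated metrics exactly as printed in the "Latest results" section.
latest = {
    "harness|drop|3": {"em": 0.13590604026845637, "f1": 0.22546770134228172},
    "harness|gsm8k|5": {"acc": 0.08567096285064443},
    "harness|winogrande|5": {"acc": 0.7474348855564326},
}

# Flatten to (task, metric, value) rows for quick inspection.
for task, metrics in latest.items():
    for name, value in metrics.items():
        print(f"{task:25s} {name:4s} {value:.4f}")
```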
open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b
[ "region:us" ]
2023-09-21T23:25:00+00:00
{"pretty_name": "Evaluation run of zarakiquemparte/zarafusionex-1.2-l2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [zarakiquemparte/zarafusionex-1.2-l2-7b](https://huggingface.co/zarakiquemparte/zarafusionex-1.2-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T14:21:00.502633](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b/blob/main/results_2023-10-23T14-21-00.502633.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13590604026845637,\n \"em_stderr\": 0.0035094536867653074,\n \"f1\": 0.22546770134228172,\n \"f1_stderr\": 0.003699492065854956,\n \"acc\": 0.4165529242035385,\n \"acc_stderr\": 0.009960183652638432\n },\n \"harness|drop|3\": {\n \"em\": 0.13590604026845637,\n \"em_stderr\": 0.0035094536867653074,\n \"f1\": 0.22546770134228172,\n \"f1_stderr\": 0.003699492065854956\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08567096285064443,\n \"acc_stderr\": 0.007709218855882758\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394107\n }\n}\n```", "repo_url": "https://huggingface.co/zarakiquemparte/zarafusionex-1.2-l2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|arc:challenge|25_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T14_21_00.502633", "path": ["**/details_harness|drop|3_2023-10-23T14-21-00.502633.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T14-21-00.502633.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T14_21_00.502633", "path": ["**/details_harness|gsm8k|5_2023-10-23T14-21-00.502633.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T14-21-00.502633.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hellaswag|10_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-24-36.284847.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-24-36.284847.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-24-36.284847.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T00-24-36.284847.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T00-24-36.284847.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T00-24-36.284847.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T14_21_00.502633", "path": ["**/details_harness|winogrande|5_2023-10-23T14-21-00.502633.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T14-21-00.502633.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T00_24_36.284847", "path": ["results_2023-09-22T00-24-36.284847.parquet"]}, {"split": "2023_10_23T14_21_00.502633", "path": ["results_2023-10-23T14-21-00.502633.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T14-21-00.502633.parquet"]}]}]}
2023-10-23T13:21:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of zarakiquemparte/zarafusionex-1.2-l2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model zarakiquemparte/zarafusionex-1.2-l2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T14:21:00.502633 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
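The loading snippet referenced just above was stripped from this plain-text rendering; the card's metadata preserves it, and a minimal version (repository, config, and split names taken verbatim from that metadata) looks like this:

```python
from datasets import load_dataset

# Load one task configuration of the evaluation details;
# per the card, the "train" split always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b",
    "harness_winogrande_5",
    split="train",
)
```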
[ "# Dataset Card for Evaluation run of zarakiquemparte/zarafusionex-1.2-l2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model zarakiquemparte/zarafusionex-1.2-l2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T14:21:00.502633(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of zarakiquemparte/zarafusionex-1.2-l2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model zarakiquemparte/zarafusionex-1.2-l2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T14:21:00.502633(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of zarakiquemparte/zarafusionex-1.2-l2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model zarakiquemparte/zarafusionex-1.2-l2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T14:21:00.502633(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
bde4fdeaedd935fffeb98d28557b11c0194aba10
# Dataset Card for "ficbook_raw" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dim/ficbook_raw
[ "region:us" ]
2023-09-22T00:01:37+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "author", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "tag", "dtype": "string"}, {"name": "likes", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "review", "dtype": "string"}, {"name": "format", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "rating", "dtype": "string"}, {"name": "status", "dtype": "string"}, {"name": "parts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1046798039, "num_examples": 114411}], "download_size": 539051486, "dataset_size": 1046798039}}
2023-09-22T00:07:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for "ficbook_raw" More Information needed
[ "# Dataset Card for \"ficbook_raw\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"ficbook_raw\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"ficbook_raw\"\n\nMore Information needed" ]
955e979aa44c476f467044ff6e01fc459cbd1eed
# Dataset Card for "hh-generated_flan_t5_large_flan_t5_large_zeroshot" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dongyoung4091/hh-generated_flan_t5_large_flan_t5_large_zeroshot
[ "region:us" ]
2023-09-22T00:31:13+00:00
{"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "zeroshot_helpfulness", "dtype": "float64"}, {"name": "zeroshot_specificity", "dtype": "float64"}, {"name": "zeroshot_intent", "dtype": "float64"}, {"name": "zeroshot_factuality", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand", "dtype": "float64"}, {"name": "zeroshot_relevance", "dtype": "float64"}, {"name": "zeroshot_readability", "dtype": "float64"}, {"name": "zeroshot_enough-detail", "dtype": "float64"}, {"name": "zeroshot_biased:", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences", "dtype": "float64"}, {"name": "zeroshot_repetetive", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-context", "dtype": "float64"}, {"name": "zeroshot_too-long", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 6336357, "num_examples": 25600}], "download_size": 814393, "dataset_size": 6336357}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T01:18:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for "hh-generated_flan_t5_large_flan_t5_large_zeroshot" More Information needed
[ "# Dataset Card for \"hh-generated_flan_t5_large_flan_t5_large_zeroshot\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"hh-generated_flan_t5_large_flan_t5_large_zeroshot\"\n\nMore Information needed" ]
[ 6, 33 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"hh-generated_flan_t5_large_flan_t5_large_zeroshot\"\n\nMore Information needed" ]
ba07fb00582355cb3ac7308c8439e7b17042e718
# Dataset Card for "hh-generated_flan_t5_large_flan_t5_base_zeroshot" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dongyoung4091/hh-generated_flan_t5_large_flan_t5_base_zeroshot
[ "region:us" ]
2023-09-22T00:32:39+00:00
{"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "zeroshot_helpfulness", "dtype": "float64"}, {"name": "zeroshot_specificity", "dtype": "float64"}, {"name": "zeroshot_intent", "dtype": "float64"}, {"name": "zeroshot_factuality", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand", "dtype": "float64"}, {"name": "zeroshot_relevance", "dtype": "float64"}, {"name": "zeroshot_readability", "dtype": "float64"}, {"name": "zeroshot_enough-detail", "dtype": "float64"}, {"name": "zeroshot_biased:", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences", "dtype": "float64"}, {"name": "zeroshot_repetetive", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-context", "dtype": "float64"}, {"name": "zeroshot_too-long", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 6336357, "num_examples": 25600}], "download_size": 0, "dataset_size": 6336357}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T23:41:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for "hh-generated_flan_t5_large_flan_t5_base_zeroshot" More Information needed
[ "# Dataset Card for \"hh-generated_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"hh-generated_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
[ 6, 32 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"hh-generated_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
f184cfc34f384cb3fc89e79871f8d305c57d95c7
# Dataset Card for "shp-generated_flan_t5_large_flan_t5_base_zeroshot" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dongyoung4091/shp-generated_flan_t5_large_flan_t5_base_zeroshot
[ "region:us" ]
2023-09-22T00:34:21+00:00
{"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "zeroshot_helpfulness", "dtype": "float64"}, {"name": "zeroshot_specificity", "dtype": "float64"}, {"name": "zeroshot_intent", "dtype": "float64"}, {"name": "zeroshot_factuality", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand", "dtype": "float64"}, {"name": "zeroshot_relevance", "dtype": "float64"}, {"name": "zeroshot_readability", "dtype": "float64"}, {"name": "zeroshot_enough-detail", "dtype": "float64"}, {"name": "zeroshot_biased:", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences", "dtype": "float64"}, {"name": "zeroshot_repetetive", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-context", "dtype": "float64"}, {"name": "zeroshot_too-long", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 29493865, "num_examples": 25600}], "download_size": 0, "dataset_size": 29493865}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T23:41:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for "shp-generated_flan_t5_large_flan_t5_base_zeroshot" More Information needed
[ "# Dataset Card for \"shp-generated_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"shp-generated_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
[ 6, 33 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"shp-generated_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
a5f825574a904e03c23281a48c691ce805fc0d57
# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-LM-7B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-29T08:21:17.529053](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1/blob/main/results_2023-10-29T08-21-17.529053.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.027894295302013424,
        "em_stderr": 0.0016863747631550056,
        "f1": 0.09094588926174478,
        "f1_stderr": 0.0021236742209429166,
        "acc": 0.393149302914779,
        "acc_stderr": 0.009302457480391348
    },
    "harness|drop|3": {
        "em": 0.027894295302013424,
        "em_stderr": 0.0016863747631550056,
        "f1": 0.09094588926174478,
        "f1_stderr": 0.0021236742209429166
    },
    "harness|gsm8k|5": {
        "acc": 0.05307050796057619,
        "acc_stderr": 0.006174868858638367
    },
    "harness|winogrande|5": {
        "acc": 0.7332280978689818,
        "acc_stderr": 0.01243004610214433
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
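Beyond the per-task configurations, the aggregate numbers quoted under "Latest results" can be reloaded from the "results" configuration the card describes; a minimal sketch, assuming the same "results"/"latest" layout used by the other evaluation datasets in this dump:

```python
from datasets import load_dataset

# "latest" mirrors the newest results parquet for this model.
results = load_dataset(
    "open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1",
    "results",
    split="latest",
)
print(results[0])
```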
open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1
[ "region:us" ]
2023-09-22T00:35:25+00:00
{"pretty_name": "Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-LM-7B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T08:21:17.529053](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1/blob/main/results_2023-10-29T08-21-17.529053.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.027894295302013424,\n \"em_stderr\": 0.0016863747631550056,\n \"f1\": 0.09094588926174478,\n \"f1_stderr\": 0.0021236742209429166,\n \"acc\": 0.393149302914779,\n \"acc_stderr\": 0.009302457480391348\n },\n \"harness|drop|3\": {\n \"em\": 0.027894295302013424,\n \"em_stderr\": 0.0016863747631550056,\n \"f1\": 0.09094588926174478,\n \"f1_stderr\": 0.0021236742209429166\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05307050796057619,\n \"acc_stderr\": 0.006174868858638367\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.01243004610214433\n }\n}\n```", "repo_url": "https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|arc:challenge|25_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T08_21_17.529053", "path": ["**/details_harness|drop|3_2023-10-29T08-21-17.529053.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T08-21-17.529053.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T08_21_17.529053", "path": ["**/details_harness|gsm8k|5_2023-10-29T08-21-17.529053.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T08-21-17.529053.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hellaswag|10_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T01-35-00.215271.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T01-35-00.215271.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T08_21_17.529053", "path": ["**/details_harness|winogrande|5_2023-10-29T08-21-17.529053.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T08-21-17.529053.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T01_35_00.215271", "path": ["results_2023-09-22T01-35-00.215271.parquet"]}, {"split": "2023_10_29T08_21_17.529053", "path": ["results_2023-10-29T08-21-17.529053.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T08-21-17.529053.parquet"]}]}]}
2023-10-29T08:21:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Xwin-LM/Xwin-LM-7B-V0.1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-29T08:21:17.529053 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
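The processed card above keeps the "do the following" sentence but loses the original code snippet. A minimal sketch of that load, assuming the details repository follows the leaderboard's `details_<org>__<model>` naming pattern seen elsewhere in this dump (the masked `URL` fields hide the real links):

```python
from datasets import load_dataset

# Repo id is inferred from the leaderboard's details_<org>__<model> pattern;
# "harness_winogrande_5" is one of the configs declared in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1",
    "harness_winogrande_5",
    split="train",  # per the card, "train" always points to the latest run
)
```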
[ "# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Xwin-LM/Xwin-LM-7B-V0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T08:21:17.529053(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Xwin-LM/Xwin-LM-7B-V0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T08:21:17.529053(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Xwin-LM/Xwin-LM-7B-V0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T08:21:17.529053(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f93cb8ff87993739adfb02094253df3135d348cf
# Dataset Card for "data_synthesis" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
thanhduycao/data_synthesis
[ "region:us" ]
2023-09-22T00:36:57+00:00
{"dataset_info": {"features": [{"name": "audio", "struct": [{"name": "array", "sequence": "float64"}, {"name": "path", "dtype": "null"}, {"name": "sampling_rate", "dtype": "int64"}]}, {"name": "transcription", "dtype": "string"}, {"name": "old_transcription", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2364881573, "num_examples": 4430}], "download_size": 559968141, "dataset_size": 2364881573}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T00:37:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for "data_synthesis" More Information needed
[ "# Dataset Card for \"data_synthesis\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"data_synthesis\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"data_synthesis\"\n\nMore Information needed" ]
b5df17cda88d08aacf8460fb0b45d337d5db538b
# Dataset Card for "shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dongyoung4091/shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot
[ "region:us" ]
2023-09-22T00:43:05+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "post_id", "dtype": "string"}, {"name": "domain", "dtype": "string"}, {"name": "upvote_ratio", "dtype": "float64"}, {"name": "history", "dtype": "string"}, {"name": "c_root_id_A", "dtype": "string"}, {"name": "c_root_id_B", "dtype": "string"}, {"name": "created_at_utc_A", "dtype": "int64"}, {"name": "created_at_utc_B", "dtype": "int64"}, {"name": "score_A", "dtype": "int64"}, {"name": "score_B", "dtype": "int64"}, {"name": "human_ref_A", "dtype": "string"}, {"name": "human_ref_B", "dtype": "string"}, {"name": "labels", "dtype": "int64"}, {"name": "seconds_difference", "dtype": "float64"}, {"name": "score_ratio", "dtype": "float64"}, {"name": "helpfulness_A", "dtype": "float64"}, {"name": "helpfulness_B", "dtype": "float64"}, {"name": "specificity_A", "dtype": "float64"}, {"name": "specificity_B", "dtype": "float64"}, {"name": "intent_A", "dtype": "float64"}, {"name": "intent_B", "dtype": "float64"}, {"name": "factuality_A", "dtype": "float64"}, {"name": "factuality_B", "dtype": "float64"}, {"name": "easy-to-understand_A", "dtype": "float64"}, {"name": "easy-to-understand_B", "dtype": "float64"}, {"name": "relevance_A", "dtype": "float64"}, {"name": "relevance_B", "dtype": "float64"}, {"name": "readability_A", "dtype": "float64"}, {"name": "readability_B", "dtype": "float64"}, {"name": "enough-detail_A", "dtype": "float64"}, {"name": "enough-detail_B", "dtype": "float64"}, {"name": "biased:_A", "dtype": "float64"}, {"name": "biased:_B", "dtype": "float64"}, {"name": "fail-to-consider-individual-preferences_A", "dtype": "float64"}, {"name": "fail-to-consider-individual-preferences_B", "dtype": "float64"}, {"name": "repetetive_A", "dtype": "float64"}, {"name": "repetetive_B", "dtype": "float64"}, {"name": "fail-to-consider-context_A", "dtype": "float64"}, {"name": "fail-to-consider-context_B", "dtype": "float64"}, {"name": "too-long_A", "dtype": "float64"}, {"name": "too-long_B", "dtype": "float64"}, {"name": "__index_level_0__", "dtype": "int64"}, {"name": "log_score_A", "dtype": "float64"}, {"name": "log_score_B", "dtype": "float64"}, {"name": "zeroshot_helpfulness_A", "dtype": "float64"}, {"name": "zeroshot_helpfulness_B", "dtype": "float64"}, {"name": "zeroshot_specificity_A", "dtype": "float64"}, {"name": "zeroshot_specificity_B", "dtype": "float64"}, {"name": "zeroshot_intent_A", "dtype": "float64"}, {"name": "zeroshot_intent_B", "dtype": "float64"}, {"name": "zeroshot_factuality_A", "dtype": "float64"}, {"name": "zeroshot_factuality_B", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand_A", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand_B", "dtype": "float64"}, {"name": "zeroshot_relevance_A", "dtype": "float64"}, {"name": "zeroshot_relevance_B", "dtype": "float64"}, {"name": "zeroshot_readability_A", "dtype": "float64"}, {"name": "zeroshot_readability_B", "dtype": "float64"}, {"name": "zeroshot_enough-detail_A", "dtype": "float64"}, {"name": "zeroshot_enough-detail_B", "dtype": "float64"}, {"name": "zeroshot_biased:_A", "dtype": "float64"}, {"name": "zeroshot_biased:_B", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences_A", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences_B", "dtype": "float64"}, {"name": "zeroshot_repetetive_A", "dtype": "float64"}, {"name": "zeroshot_repetetive_B", "dtype": "float64"}, {"name": 
"zeroshot_fail-to-consider-context_A", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-context_B", "dtype": "float64"}, {"name": "zeroshot_too-long_A", "dtype": "float64"}, {"name": "zeroshot_too-long_B", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 22674534, "num_examples": 9459}, {"name": "test", "num_bytes": 22627412, "num_examples": 9459}], "download_size": 12124964, "dataset_size": 45301946}}
2023-09-22T01:12:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for "shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot" More Information needed
[ "# Dataset Card for \"shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot\"\n\nMore Information needed" ]
[ 6, 39 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot\"\n\nMore Information needed" ]
8b24834f98b1ee9ee9a4410fcc94a8c6a460b430
# Dataset Card for "shp_with_features_20k_flan_t5_large_flan_t5_base_zeroshot" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dongyoung4091/shp_with_features_20k_flan_t5_large_flan_t5_base_zeroshot
[ "region:us" ]
2023-09-22T00:46:20+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "post_id", "dtype": "string"}, {"name": "domain", "dtype": "string"}, {"name": "upvote_ratio", "dtype": "float64"}, {"name": "history", "dtype": "string"}, {"name": "c_root_id_A", "dtype": "string"}, {"name": "c_root_id_B", "dtype": "string"}, {"name": "created_at_utc_A", "dtype": "int64"}, {"name": "created_at_utc_B", "dtype": "int64"}, {"name": "score_A", "dtype": "int64"}, {"name": "score_B", "dtype": "int64"}, {"name": "human_ref_A", "dtype": "string"}, {"name": "human_ref_B", "dtype": "string"}, {"name": "labels", "dtype": "int64"}, {"name": "seconds_difference", "dtype": "float64"}, {"name": "score_ratio", "dtype": "float64"}, {"name": "helpfulness_A", "dtype": "float64"}, {"name": "helpfulness_B", "dtype": "float64"}, {"name": "specificity_A", "dtype": "float64"}, {"name": "specificity_B", "dtype": "float64"}, {"name": "intent_A", "dtype": "float64"}, {"name": "intent_B", "dtype": "float64"}, {"name": "factuality_A", "dtype": "float64"}, {"name": "factuality_B", "dtype": "float64"}, {"name": "easy-to-understand_A", "dtype": "float64"}, {"name": "easy-to-understand_B", "dtype": "float64"}, {"name": "relevance_A", "dtype": "float64"}, {"name": "relevance_B", "dtype": "float64"}, {"name": "readability_A", "dtype": "float64"}, {"name": "readability_B", "dtype": "float64"}, {"name": "enough-detail_A", "dtype": "float64"}, {"name": "enough-detail_B", "dtype": "float64"}, {"name": "biased:_A", "dtype": "float64"}, {"name": "biased:_B", "dtype": "float64"}, {"name": "fail-to-consider-individual-preferences_A", "dtype": "float64"}, {"name": "fail-to-consider-individual-preferences_B", "dtype": "float64"}, {"name": "repetetive_A", "dtype": "float64"}, {"name": "repetetive_B", "dtype": "float64"}, {"name": "fail-to-consider-context_A", "dtype": "float64"}, {"name": "fail-to-consider-context_B", "dtype": "float64"}, {"name": "too-long_A", "dtype": "float64"}, {"name": "too-long_B", "dtype": "float64"}, {"name": "__index_level_0__", "dtype": "int64"}, {"name": "log_score_A", "dtype": "float64"}, {"name": "log_score_B", "dtype": "float64"}, {"name": "zeroshot_helpfulness_A", "dtype": "float64"}, {"name": "zeroshot_helpfulness_B", "dtype": "float64"}, {"name": "zeroshot_specificity_A", "dtype": "float64"}, {"name": "zeroshot_specificity_B", "dtype": "float64"}, {"name": "zeroshot_intent_A", "dtype": "float64"}, {"name": "zeroshot_intent_B", "dtype": "float64"}, {"name": "zeroshot_factuality_A", "dtype": "float64"}, {"name": "zeroshot_factuality_B", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand_A", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand_B", "dtype": "float64"}, {"name": "zeroshot_relevance_A", "dtype": "float64"}, {"name": "zeroshot_relevance_B", "dtype": "float64"}, {"name": "zeroshot_readability_A", "dtype": "float64"}, {"name": "zeroshot_readability_B", "dtype": "float64"}, {"name": "zeroshot_enough-detail_A", "dtype": "float64"}, {"name": "zeroshot_enough-detail_B", "dtype": "float64"}, {"name": "zeroshot_biased:_A", "dtype": "float64"}, {"name": "zeroshot_biased:_B", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences_A", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences_B", "dtype": "float64"}, {"name": "zeroshot_repetetive_A", "dtype": "float64"}, {"name": "zeroshot_repetetive_B", "dtype": "float64"}, {"name": 
"zeroshot_fail-to-consider-context_A", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-context_B", "dtype": "float64"}, {"name": "zeroshot_too-long_A", "dtype": "float64"}, {"name": "zeroshot_too-long_B", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 22674534, "num_examples": 9459}, {"name": "test", "num_bytes": 22627412, "num_examples": 9459}], "download_size": 0, "dataset_size": 45301946}}
2023-09-22T23:41:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for "shp_with_features_20k_flan_t5_large_flan_t5_base_zeroshot" More Information needed
[ "# Dataset Card for \"shp_with_features_20k_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"shp_with_features_20k_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
[ 6, 38 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"shp_with_features_20k_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
92d09cc6a7794fe0455a5ecdea44193a0fa0e975
# Dataset Card for "hh-rlhf_with_features_flan_t5_large_flan_t5_base_zeroshot" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dongyoung4091/hh-rlhf_with_features_flan_t5_large_flan_t5_base_zeroshot
[ "region:us" ]
2023-09-22T00:49:46+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "helpfulness_chosen", "dtype": "int64"}, {"name": "helpfulness_rejected", "dtype": "int64"}, {"name": "specificity_chosen", "dtype": "int64"}, {"name": "specificity_rejected", "dtype": "int64"}, {"name": "intent_chosen", "dtype": "int64"}, {"name": "intent_rejected", "dtype": "int64"}, {"name": "factuality_chosen", "dtype": "int64"}, {"name": "factuality_rejected", "dtype": "int64"}, {"name": "easy-to-understand_chosen", "dtype": "int64"}, {"name": "easy-to-understand_rejected", "dtype": "int64"}, {"name": "relevance_chosen", "dtype": "int64"}, {"name": "relevance_rejected", "dtype": "int64"}, {"name": "readability_chosen", "dtype": "int64"}, {"name": "readability_rejected", "dtype": "int64"}, {"name": "enough-detail_chosen", "dtype": "int64"}, {"name": "enough-detail_rejected", "dtype": "int64"}, {"name": "biased:_chosen", "dtype": "int64"}, {"name": "biased:_rejected", "dtype": "int64"}, {"name": "fail-to-consider-individual-preferences_chosen", "dtype": "int64"}, {"name": "fail-to-consider-individual-preferences_rejected", "dtype": "int64"}, {"name": "repetetive_chosen", "dtype": "int64"}, {"name": "repetetive_rejected", "dtype": "int64"}, {"name": "fail-to-consider-context_chosen", "dtype": "int64"}, {"name": "fail-to-consider-context_rejected", "dtype": "int64"}, {"name": "too-long_chosen", "dtype": "int64"}, {"name": "too-long_rejected", "dtype": "int64"}, {"name": "human", "dtype": "string"}, {"name": "assistant_chosen", "dtype": "string"}, {"name": "assistant_rejected", "dtype": "string"}, {"name": "log_score_chosen", "dtype": "float64"}, {"name": "log_score_rejected", "dtype": "float64"}, {"name": "labels", "dtype": "string"}, {"name": "zeroshot_helpfulness_chosen", "dtype": "float64"}, {"name": "zeroshot_helpfulness_rejected", "dtype": "float64"}, {"name": "zeroshot_specificity_chosen", "dtype": "float64"}, {"name": "zeroshot_specificity_rejected", "dtype": "float64"}, {"name": "zeroshot_intent_chosen", "dtype": "float64"}, {"name": "zeroshot_intent_rejected", "dtype": "float64"}, {"name": "zeroshot_factuality_chosen", "dtype": "float64"}, {"name": "zeroshot_factuality_rejected", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand_chosen", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand_rejected", "dtype": "float64"}, {"name": "zeroshot_relevance_chosen", "dtype": "float64"}, {"name": "zeroshot_relevance_rejected", "dtype": "float64"}, {"name": "zeroshot_readability_chosen", "dtype": "float64"}, {"name": "zeroshot_readability_rejected", "dtype": "float64"}, {"name": "zeroshot_enough-detail_chosen", "dtype": "float64"}, {"name": "zeroshot_enough-detail_rejected", "dtype": "float64"}, {"name": "zeroshot_biased:_chosen", "dtype": "float64"}, {"name": "zeroshot_biased:_rejected", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences_chosen", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences_rejected", "dtype": "float64"}, {"name": "zeroshot_repetetive_chosen", "dtype": "float64"}, {"name": "zeroshot_repetetive_rejected", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-context_chosen", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-context_rejected", "dtype": "float64"}, {"name": "zeroshot_too-long_chosen", "dtype": "float64"}, 
{"name": "zeroshot_too-long_rejected", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 16425816, "num_examples": 9574}, {"name": "test", "num_bytes": 16369741, "num_examples": 9574}], "download_size": 0, "dataset_size": 32795557}}
2023-09-22T23:41:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for "hh-rlhf_with_features_flan_t5_large_flan_t5_base_zeroshot" More Information needed
[ "# Dataset Card for \"hh-rlhf_with_features_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"hh-rlhf_with_features_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
[ 6, 38 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"hh-rlhf_with_features_flan_t5_large_flan_t5_base_zeroshot\"\n\nMore Information needed" ]
8663299930a53dc4a8317807930a4e88bea94ff3
# Dataset Card for "commit-pack-lua-fixes-filter" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
cassanof/commit-pack-lua-fixes-filter
[ "region:us" ]
2023-09-22T00:57:40+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "commit", "dtype": "string"}, {"name": "old_file", "dtype": "string"}, {"name": "new_file", "dtype": "string"}, {"name": "old_contents", "dtype": "string"}, {"name": "new_contents", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "lang", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repos", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 56385828, "num_examples": 7051}], "download_size": 24189442, "dataset_size": 56385828}}
2023-09-22T01:20:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for "commit-pack-lua-fixes-filter" More Information needed
[ "# Dataset Card for \"commit-pack-lua-fixes-filter\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"commit-pack-lua-fixes-filter\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"commit-pack-lua-fixes-filter\"\n\nMore Information needed" ]
491aa4e123d39882404d1ac4cf6d6fbcaad8eac7
# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-13B-V0.1

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Xwin-LM/Xwin-LM-13B-V0.1
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-LM-13B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-13B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-29T07:10:16.987751](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1/blob/main/results_2023-10-29T07-10-16.987751.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.04509228187919463,
        "em_stderr": 0.0021250612871047806,
        "f1": 0.10556522651006703,
        "f1_stderr": 0.002412258426782379,
        "acc": 0.4194921770516876,
        "acc_stderr": 0.0102056268735408
    },
    "harness|drop|3": {
        "em": 0.04509228187919463,
        "em_stderr": 0.0021250612871047806,
        "f1": 0.10556522651006703,
        "f1_stderr": 0.002412258426782379
    },
    "harness|gsm8k|5": {
        "acc": 0.09628506444275967,
        "acc_stderr": 0.008125264128215896
    },
    "harness|winogrande|5": {
        "acc": 0.7426992896606156,
        "acc_stderr": 0.012285989618865702
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1
[ "region:us" ]
2023-09-22T01:01:09+00:00
{"pretty_name": "Evaluation run of Xwin-LM/Xwin-LM-13B-V0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-LM-13B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-13B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T07:10:16.987751](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1/blob/main/results_2023-10-29T07-10-16.987751.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04509228187919463,\n \"em_stderr\": 0.0021250612871047806,\n \"f1\": 0.10556522651006703,\n \"f1_stderr\": 0.002412258426782379,\n \"acc\": 0.4194921770516876,\n \"acc_stderr\": 0.0102056268735408\n },\n \"harness|drop|3\": {\n \"em\": 0.04509228187919463,\n \"em_stderr\": 0.0021250612871047806,\n \"f1\": 0.10556522651006703,\n \"f1_stderr\": 0.002412258426782379\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09628506444275967,\n \"acc_stderr\": 0.008125264128215896\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.012285989618865702\n }\n}\n```", "repo_url": "https://huggingface.co/Xwin-LM/Xwin-LM-13B-V0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|arc:challenge|25_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T07_10_16.987751", "path": ["**/details_harness|drop|3_2023-10-29T07-10-16.987751.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T07-10-16.987751.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T07_10_16.987751", "path": ["**/details_harness|gsm8k|5_2023-10-29T07-10-16.987751.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T07-10-16.987751.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hellaswag|10_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-00-45.467077.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-00-45.467077.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T02-00-45.467077.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T02-00-45.467077.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T02-00-45.467077.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T07_10_16.987751", "path": ["**/details_harness|winogrande|5_2023-10-29T07-10-16.987751.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T07-10-16.987751.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T02_00_45.467077", "path": ["results_2023-09-22T02-00-45.467077.parquet"]}, {"split": "2023_10_29T07_10_16.987751", "path": ["results_2023-10-29T07-10-16.987751.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T07-10-16.987751.parquet"]}]}]}
2023-10-29T07:10:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-13B-V0.1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Xwin-LM/Xwin-LM-13B-V0.1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the snippet reproduced after this record): ## Latest results These are the latest results from run 2023-10-29T07:10:16.987751 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
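For reference, here is the loading snippet referred to above, reproduced verbatim from this record's dataset summary:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1",
	"harness_winogrande_5",
	split="train")
```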
[ "# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-13B-V0.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Xwin-LM/Xwin-LM-13B-V0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T07:10:16.987751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-13B-V0.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Xwin-LM/Xwin-LM-13B-V0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T07:10:16.987751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-13B-V0.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Xwin-LM/Xwin-LM-13B-V0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T07:10:16.987751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9cfd34b1ce08d4c22dabd10a3bc584106ebc4ace
Chinese-Dolly-15k is a Traditional Chinese translation of the Dolly instruction dataset (Databricks), provided in a question-answering JSON format for fine-tuning.

The original dataset, 'databricks/databricks-dolly-15k', is an open-source dataset of instruction-following records generated by thousands of Databricks employees across the behavioral categories outlined in the InstructGPT paper. These categories include brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization.

Under the Creative Commons Attribution-ShareAlike 3.0 (CC BY-SA 3.0) license, this dataset may be used for any academic or commercial purpose.

If you are also preparing datasets like these, feel free to contact us so we can avoid duplicating costs.

## Citation

Please cite the repo if you use the data or code in this repo.

```
@misc{alpaca,
  author = {DavidLanz},
  title = {An Instruction-following Chinese Language model, LoRA tuning on LLaMA},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  url = {https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm},
  urldate = {2023-09-15}
}
```
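Since the card describes a question-answering JSON format meant for fine-tuning, a minimal loading sketch follows. The repository id comes from this record, but the split name and the Dolly-style field names are assumptions rather than anything the card confirms.

```python
from datasets import load_dataset

# Minimal sketch: load the Traditional Chinese Dolly dataset.
# The "train" split and Dolly-style fields (instruction/input/output)
# are assumptions -- inspect a record to confirm the real schema.
ds = load_dataset("DavidLanz/chinese-dolly-input-output-15k", split="train")
print(ds[0])
```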
DavidLanz/chinese-dolly-input-output-15k
[ "task_categories:question-answering", "task_categories:summarization", "task_categories:text-generation", "size_categories:10K<n<100K", "language:zh", "language:en", "license:cc-by-sa-3.0", "region:us" ]
2023-09-22T01:11:39+00:00
{"language": ["zh", "en"], "license": "cc-by-sa-3.0", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "summarization", "text-generation"]}
2023-09-22T01:13:53+00:00
[]
[ "zh", "en" ]
TAGS #task_categories-question-answering #task_categories-summarization #task_categories-text-generation #size_categories-10K<n<100K #language-Chinese #language-English #license-cc-by-sa-3.0 #region-us
Chinese-Dolly-15k is a Traditional Chinese translation of the Dolly instruction dataset (Databricks), provided in a question-answering JSON format for fine-tuning. The original dataset, 'databricks/databricks-dolly-15k', is an open-source dataset of instruction-following records generated by thousands of Databricks employees across the behavioral categories outlined in the InstructGPT paper. These categories include brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization. Under the Creative Commons Attribution-ShareAlike 3.0 (CC BY-SA 3.0) license, this dataset may be used for any academic or commercial purpose. If you are also preparing datasets like these, feel free to contact us so we can avoid duplicating costs. Please cite the repo if you use the data or code in this repo.
[]
[ "TAGS\n#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #size_categories-10K<n<100K #language-Chinese #language-English #license-cc-by-sa-3.0 #region-us \n" ]
[ 71 ]
[ "passage: TAGS\n#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #size_categories-10K<n<100K #language-Chinese #language-English #license-cc-by-sa-3.0 #region-us \n" ]
b5f6a8af071757a055fe8cb66c59238b3b93414e
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage3

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/euclaise/falcon_1b_stage3
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage3](https://huggingface.co/euclaise/falcon_1b_stage3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage3",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-26T07:58:45.651251](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage3/blob/main/results_2023-10-26T07-58-45.651251.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.002726510067114094,
        "em_stderr": 0.0005340111700415902,
        "f1": 0.05945679530201347,
        "f1_stderr": 0.0017516314854118453,
        "acc": 0.2975532754538279,
        "acc_stderr": 0.006897963501562468
    },
    "harness|drop|3": {
        "em": 0.002726510067114094,
        "em_stderr": 0.0005340111700415902,
        "f1": 0.05945679530201347,
        "f1_stderr": 0.0017516314854118453
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5951065509076559,
        "acc_stderr": 0.013795927003124936
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
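Since the card mentions 64 task configurations, a quick way to see them all is a small sketch using the `datasets` library's standard `get_dataset_config_names` helper; the repository id is taken from this record.

```python
from datasets import get_dataset_config_names

# List every evaluation-task configuration stored in this details repo.
configs = get_dataset_config_names("open-llm-leaderboard/details_euclaise__falcon_1b_stage3")
print(len(configs), configs[:5])
```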
open-llm-leaderboard/details_euclaise__falcon_1b_stage3
[ "region:us" ]
2023-09-22T01:13:02+00:00
{"pretty_name": "Evaluation run of euclaise/falcon_1b_stage3", "dataset_summary": "Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage3](https://huggingface.co/euclaise/falcon_1b_stage3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__falcon_1b_stage3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-26T07:58:45.651251](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage3/blob/main/results_2023-10-26T07-58-45.651251.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415902,\n \"f1\": 0.05945679530201347,\n \"f1_stderr\": 0.0017516314854118453,\n \"acc\": 0.2975532754538279,\n \"acc_stderr\": 0.006897963501562468\n },\n \"harness|drop|3\": {\n \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415902,\n \"f1\": 0.05945679530201347,\n \"f1_stderr\": 0.0017516314854118453\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5951065509076559,\n \"acc_stderr\": 0.013795927003124936\n }\n}\n```", "repo_url": "https://huggingface.co/euclaise/falcon_1b_stage3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|arc:challenge|25_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_26T07_58_45.651251", "path": ["**/details_harness|drop|3_2023-10-26T07-58-45.651251.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-26T07-58-45.651251.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_26T07_58_45.651251", "path": ["**/details_harness|gsm8k|5_2023-10-26T07-58-45.651251.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-26T07-58-45.651251.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hellaswag|10_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-12-44.101824.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-12-44.101824.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T02-12-44.101824.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T02-12-44.101824.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T02-12-44.101824.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_26T07_58_45.651251", "path": ["**/details_harness|winogrande|5_2023-10-26T07-58-45.651251.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-26T07-58-45.651251.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T02_12_44.101824", "path": ["results_2023-09-22T02-12-44.101824.parquet"]}, {"split": "2023_10_26T07_58_45.651251", "path": ["results_2023-10-26T07-58-45.651251.parquet"]}, {"split": "latest", "path": ["results_2023-10-26T07-58-45.651251.parquet"]}]}]}
2023-10-26T06:58:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage3

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model euclaise/falcon_1b_stage3 on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-26T07:58:45.651251 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
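The loading snippet is not included in this record, so here is a minimal sketch following the pattern these evaluation cards share. The repo id `open-llm-leaderboard/details_euclaise__falcon_1b_stage3` is inferred from the model name rather than stated in the record; the config name and the `latest` split alias are taken from this record's `configs` metadata above.

```python
from datasets import load_dataset

# Repo id inferred from the model name (open-llm-leaderboard/details_<org>__<model>);
# the config name and the "latest" split alias come from this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_euclaise__falcon_1b_stage3",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```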
[ "# Dataset Card for Evaluation run of euclaise/falcon_1b_stage3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/falcon_1b_stage3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-26T07:58:45.651251(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of euclaise/falcon_1b_stage3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/falcon_1b_stage3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-26T07:58:45.651251(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of euclaise/falcon_1b_stage3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/falcon_1b_stage3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-26T07:58:45.651251(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0cbdbbb23ece84664b55b688bb3507c6fa8529f9
# Dataset Card for Evaluation run of NoIdeaLand/test-3k-mx

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/NoIdeaLand/test-3k-mx
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [NoIdeaLand/test-3k-mx](https://huggingface.co/NoIdeaLand/test-3k-mx) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-3k-mx",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T02:20:18.679270](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-3k-mx/blob/main/results_2023-09-22T02-20-18.679270.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.25937377998421673,
        "acc_stderr": 0.03158826848264918,
        "acc_norm": 0.263032034220802,
        "acc_norm_stderr": 0.031588822884227444,
        "mc1": 0.23623011015911874,
        "mc1_stderr": 0.014869755015871108,
        "mc2": 0.4093188700877857,
        "mc2_stderr": 0.014339231042407396
    },
    "harness|arc:challenge|25": {
        "acc": 0.3438566552901024,
        "acc_stderr": 0.013880644570156201,
        "acc_norm": 0.38054607508532423,
        "acc_norm_stderr": 0.014188277712349822
    },
    "harness|hellaswag|10": {
        "acc": 0.48516231826329415,
        "acc_stderr": 0.004987583858923224,
        "acc_norm": 0.6643098984266083,
        "acc_norm_stderr": 0.004712660409846823
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.23,
        "acc_stderr": 0.04229525846816507,
        "acc_norm": 0.23,
        "acc_norm_stderr": 0.04229525846816507
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.1925925925925926,
        "acc_stderr": 0.034065420585026505,
        "acc_norm": 0.1925925925925926,
        "acc_norm_stderr": 0.034065420585026505
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.18421052631578946,
        "acc_stderr": 0.0315469804508223,
        "acc_norm": 0.18421052631578946,
        "acc_norm_stderr": 0.0315469804508223
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.34,
        "acc_stderr": 0.04760952285695235,
        "acc_norm": 0.34,
        "acc_norm_stderr": 0.04760952285695235
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.2528301886792453,
        "acc_stderr": 0.026749899771241235,
        "acc_norm": 0.2528301886792453,
        "acc_norm_stderr": 0.026749899771241235
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.2569444444444444,
        "acc_stderr": 0.03653946969442099,
        "acc_norm": 0.2569444444444444,
        "acc_norm_stderr": 0.03653946969442099
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.23,
        "acc_stderr": 0.04229525846816508,
        "acc_norm": 0.23,
        "acc_norm_stderr": 0.04229525846816508
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.18,
        "acc_stderr": 0.03861229196653695,
        "acc_norm": 0.18,
        "acc_norm_stderr": 0.03861229196653695
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.21,
        "acc_stderr": 0.040936018074033256,
        "acc_norm": 0.21,
        "acc_norm_stderr": 0.040936018074033256
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.2138728323699422,
        "acc_stderr": 0.031265112061730445,
        "acc_norm": 0.2138728323699422,
        "acc_norm_stderr": 0.031265112061730445
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.20588235294117646,
        "acc_stderr": 0.04023382273617746,
        "acc_norm": 0.20588235294117646,
        "acc_norm_stderr": 0.04023382273617746
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.26,
        "acc_stderr": 0.04408440022768077,
        "acc_norm": 0.26,
        "acc_norm_stderr": 0.04408440022768077
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.28936170212765955,
        "acc_stderr": 0.029644006577009618,
        "acc_norm": 0.28936170212765955,
        "acc_norm_stderr": 0.029644006577009618
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.16666666666666666,
        "acc_stderr": 0.035058596825972656,
        "acc_norm": 0.16666666666666666,
        "acc_norm_stderr": 0.035058596825972656
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.2827586206896552,
        "acc_stderr": 0.037528339580033376,
        "acc_norm": 0.2827586206896552,
        "acc_norm_stderr": 0.037528339580033376
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.21428571428571427,
        "acc_stderr": 0.02113285918275445,
        "acc_norm": 0.21428571428571427,
        "acc_norm_stderr": 0.02113285918275445
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.1984126984126984,
        "acc_stderr": 0.03567016675276863,
        "acc_norm": 0.1984126984126984,
        "acc_norm_stderr": 0.03567016675276863
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.23,
        "acc_stderr": 0.042295258468165065,
        "acc_norm": 0.23,
        "acc_norm_stderr": 0.042295258468165065
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.2032258064516129,
        "acc_stderr": 0.022891687984554952,
        "acc_norm": 0.2032258064516129,
        "acc_norm_stderr": 0.022891687984554952
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.21182266009852216,
        "acc_stderr": 0.028748983689941065,
        "acc_norm": 0.21182266009852216,
        "acc_norm_stderr": 0.028748983689941065
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.26,
        "acc_stderr": 0.04408440022768078,
        "acc_norm": 0.26,
        "acc_norm_stderr": 0.04408440022768078
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.2606060606060606,
        "acc_stderr": 0.03427743175816524,
        "acc_norm": 0.2606060606060606,
        "acc_norm_stderr": 0.03427743175816524
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.18686868686868688,
        "acc_stderr": 0.027772533334218977,
        "acc_norm": 0.18686868686868688,
        "acc_norm_stderr": 0.027772533334218977
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.26424870466321243,
        "acc_stderr": 0.031821550509166484,
        "acc_norm": 0.26424870466321243,
        "acc_norm_stderr": 0.031821550509166484
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.28717948717948716,
        "acc_stderr": 0.022939925418530616,
        "acc_norm": 0.28717948717948716,
        "acc_norm_stderr": 0.022939925418530616
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.2740740740740741,
        "acc_stderr": 0.027195934804085622,
        "acc_norm": 0.2740740740740741,
        "acc_norm_stderr": 0.027195934804085622
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.226890756302521,
        "acc_stderr": 0.02720537153827947,
        "acc_norm": 0.226890756302521,
        "acc_norm_stderr": 0.02720537153827947
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.24503311258278146,
        "acc_stderr": 0.035118075718047245,
        "acc_norm": 0.24503311258278146,
        "acc_norm_stderr": 0.035118075718047245
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.1981651376146789,
        "acc_stderr": 0.017090573804217885,
        "acc_norm": 0.1981651376146789,
        "acc_norm_stderr": 0.017090573804217885
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.1712962962962963,
        "acc_stderr": 0.02569534164382467,
        "acc_norm": 0.1712962962962963,
        "acc_norm_stderr": 0.02569534164382467
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.2549019607843137,
        "acc_stderr": 0.030587591351604246,
        "acc_norm": 0.2549019607843137,
        "acc_norm_stderr": 0.030587591351604246
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.2911392405063291,
        "acc_stderr": 0.02957160106575337,
        "acc_norm": 0.2911392405063291,
        "acc_norm_stderr": 0.02957160106575337
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.34080717488789236,
        "acc_stderr": 0.031811497470553604,
        "acc_norm": 0.34080717488789236,
        "acc_norm_stderr": 0.031811497470553604
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.2824427480916031,
        "acc_stderr": 0.03948406125768361,
        "acc_norm": 0.2824427480916031,
        "acc_norm_stderr": 0.03948406125768361
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.2727272727272727,
        "acc_stderr": 0.04065578140908705,
        "acc_norm": 0.2727272727272727,
        "acc_norm_stderr": 0.04065578140908705
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.2962962962962963,
        "acc_stderr": 0.044143436668549335,
        "acc_norm": 0.2962962962962963,
        "acc_norm_stderr": 0.044143436668549335
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.26993865030674846,
        "acc_stderr": 0.03487825168497892,
        "acc_norm": 0.26993865030674846,
        "acc_norm_stderr": 0.03487825168497892
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.35714285714285715,
        "acc_stderr": 0.04547960999764376,
        "acc_norm": 0.35714285714285715,
        "acc_norm_stderr": 0.04547960999764376
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.18446601941747573,
        "acc_stderr": 0.03840423627288276,
        "acc_norm": 0.18446601941747573,
        "acc_norm_stderr": 0.03840423627288276
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.3504273504273504,
        "acc_stderr": 0.03125610824421881,
        "acc_norm": 0.3504273504273504,
        "acc_norm_stderr": 0.03125610824421881
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.33,
        "acc_stderr": 0.04725815626252604,
        "acc_norm": 0.33,
        "acc_norm_stderr": 0.04725815626252604
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.26053639846743293,
        "acc_stderr": 0.01569600856380709,
        "acc_norm": 0.26053639846743293,
        "acc_norm_stderr": 0.01569600856380709
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.25722543352601157,
        "acc_stderr": 0.023532925431044273,
        "acc_norm": 0.25722543352601157,
        "acc_norm_stderr": 0.023532925431044273
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.24134078212290502,
        "acc_stderr": 0.014310999547961455,
        "acc_norm": 0.24134078212290502,
        "acc_norm_stderr": 0.014310999547961455
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.2875816993464052,
        "acc_stderr": 0.02591780611714716,
        "acc_norm": 0.2875816993464052,
        "acc_norm_stderr": 0.02591780611714716
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.2282958199356913,
        "acc_stderr": 0.023839303311398222,
        "acc_norm": 0.2282958199356913,
        "acc_norm_stderr": 0.023839303311398222
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.2345679012345679,
        "acc_stderr": 0.023576881744005716,
        "acc_norm": 0.2345679012345679,
        "acc_norm_stderr": 0.023576881744005716
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.25177304964539005,
        "acc_stderr": 0.025892151156709405,
        "acc_norm": 0.25177304964539005,
        "acc_norm_stderr": 0.025892151156709405
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.2770534550195567,
        "acc_stderr": 0.011430462443719676,
        "acc_norm": 0.2770534550195567,
        "acc_norm_stderr": 0.011430462443719676
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.19852941176470587,
        "acc_stderr": 0.024231013370541104,
        "acc_norm": 0.19852941176470587,
        "acc_norm_stderr": 0.024231013370541104
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.2549019607843137,
        "acc_stderr": 0.017630827375148383,
        "acc_norm": 0.2549019607843137,
        "acc_norm_stderr": 0.017630827375148383
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.3,
        "acc_stderr": 0.04389311454644287,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.04389311454644287
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.2530612244897959,
        "acc_stderr": 0.027833023871399677,
        "acc_norm": 0.2530612244897959,
        "acc_norm_stderr": 0.027833023871399677
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.23880597014925373,
        "acc_stderr": 0.030147775935409217,
        "acc_norm": 0.23880597014925373,
        "acc_norm_stderr": 0.030147775935409217
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.41,
        "acc_stderr": 0.049431107042371025,
        "acc_norm": 0.41,
        "acc_norm_stderr": 0.049431107042371025
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.30120481927710846,
        "acc_stderr": 0.03571609230053481,
        "acc_norm": 0.30120481927710846,
        "acc_norm_stderr": 0.03571609230053481
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.3216374269005848,
        "acc_stderr": 0.03582529442573122,
        "acc_norm": 0.3216374269005848,
        "acc_norm_stderr": 0.03582529442573122
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.23623011015911874,
        "mc1_stderr": 0.014869755015871108,
        "mc2": 0.4093188700877857,
        "mc2_stderr": 0.014339231042407396
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_NoIdeaLand__test-3k-mx
[ "region:us" ]
2023-09-22T01:20:37+00:00
{"pretty_name": "Evaluation run of NoIdeaLand/test-3k-mx", "dataset_summary": "Dataset automatically created during the evaluation run of model [NoIdeaLand/test-3k-mx](https://huggingface.co/NoIdeaLand/test-3k-mx) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NoIdeaLand__test-3k-mx\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T02:20:18.679270](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-3k-mx/blob/main/results_2023-09-22T02-20-18.679270.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25937377998421673,\n \"acc_stderr\": 0.03158826848264918,\n \"acc_norm\": 0.263032034220802,\n \"acc_norm_stderr\": 0.031588822884227444,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.4093188700877857,\n \"mc2_stderr\": 0.014339231042407396\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3438566552901024,\n \"acc_stderr\": 0.013880644570156201,\n \"acc_norm\": 0.38054607508532423,\n \"acc_norm_stderr\": 0.014188277712349822\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48516231826329415,\n \"acc_stderr\": 0.004987583858923224,\n \"acc_norm\": 0.6643098984266083,\n \"acc_norm_stderr\": 0.004712660409846823\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n \"acc_stderr\": 0.034065420585026505,\n \"acc_norm\": 0.1925925925925926,\n \"acc_norm_stderr\": 0.034065420585026505\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241235,\n \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241235\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n 
\"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.031265112061730445,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.031265112061730445\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.029644006577009618,\n \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.029644006577009618\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.035058596825972656,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.035058596825972656\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02113285918275445,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02113285918275445\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276863,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276863\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2032258064516129,\n \"acc_stderr\": 0.022891687984554952,\n \"acc_norm\": 0.2032258064516129,\n \"acc_norm_stderr\": 0.022891687984554952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.028748983689941065,\n \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.028748983689941065\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.031821550509166484,\n \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.031821550509166484\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.28717948717948716,\n \"acc_stderr\": 0.022939925418530616,\n \"acc_norm\": 0.28717948717948716,\n \"acc_norm_stderr\": 0.022939925418530616\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.02720537153827947,\n \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.02720537153827947\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1981651376146789,\n \"acc_stderr\": 0.017090573804217885,\n \"acc_norm\": 0.1981651376146789,\n \"acc_norm_stderr\": 0.017090573804217885\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1712962962962963,\n \"acc_stderr\": 0.02569534164382467,\n \"acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.02569534164382467\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.34080717488789236,\n \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3504273504273504,\n \"acc_stderr\": 0.03125610824421881,\n \"acc_norm\": 0.3504273504273504,\n \"acc_norm_stderr\": 0.03125610824421881\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n \"acc_stderr\": 0.01569600856380709,\n 
\"acc_norm\": 0.26053639846743293,\n \"acc_norm_stderr\": 0.01569600856380709\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044273,\n \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044273\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961455,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961455\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n \"acc_stderr\": 0.023839303311398222,\n \"acc_norm\": 0.2282958199356913,\n \"acc_norm_stderr\": 0.023839303311398222\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005716,\n \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2770534550195567,\n \"acc_stderr\": 0.011430462443719676,\n \"acc_norm\": 0.2770534550195567,\n \"acc_norm_stderr\": 0.011430462443719676\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541104,\n \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541104\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399677,\n \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.4093188700877857,\n \"mc2_stderr\": 0.014339231042407396\n }\n}\n```", "repo_url": "https://huggingface.co/NoIdeaLand/test-3k-mx", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": 
[{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|arc:challenge|25_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hellaswag|10_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T02-20-18.679270.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T02_20_18.679270", "path": ["results_2023-09-22T02-20-18.679270.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T02-20-18.679270.parquet"]}]}]}
2023-09-22T01:21:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of NoIdeaLand/test-3k-mx ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model NoIdeaLand/test-3k-mx on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T02:20:18.679270 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
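A minimal sketch of that loading call, assuming the `details_NoIdeaLand__test-3k-mx` repo id implied by the `details_<org>__<model>` naming pattern of these cards (the config name `harness_truthfulqa_mc_0` comes from the configs metadata listed earlier in this record):

```python
from datasets import load_dataset

# Repo id is an assumption, inferred from the "details_<org>__<model>"
# naming pattern of these evaluation cards; the config name comes from
# the configs metadata above, where each config also exposes a
# timestamped split (here 2023_09_22T02_20_18.679270) next to "latest".
data = load_dataset(
    "open-llm-leaderboard/details_NoIdeaLand__test-3k-mx",
    "harness_truthfulqa_mc_0",
    split="train",
)
```

A specific run can be selected by passing the timestamped split name instead of `train`.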
[ "# Dataset Card for Evaluation run of NoIdeaLand/test-3k-mx", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model NoIdeaLand/test-3k-mx on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T02:20:18.679270(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of NoIdeaLand/test-3k-mx", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model NoIdeaLand/test-3k-mx on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T02:20:18.679270(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NoIdeaLand/test-3k-mx## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model NoIdeaLand/test-3k-mx on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T02:20:18.679270(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
dcca2ec0fe14fcbc89b7bd7eaa1181d59c25e1f4
## Chinese Chitchat Dataset

The `role` field takes one of three values: "unknown", "human", "assistant".

The datasets were collected and organized from the web as follows:

| Data | Original data / project URL | Samples | Corpus description | Alternative download |
| :--- | :---: | :---: | :---: | :---: |
| ChatterBot | [ChatterBot](https://github.com/gunthercox/ChatterBot); [chatterbot-corpus](https://github.com/gunthercox/chatterbot-corpus) | 560 | Categorized by type; relatively high quality | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| douban | [Douban Conversation Corpus](https://github.com/MarkWuNLP/MultiTurnResponseSelection) | 3.52M | From a paper by Beihang University and Microsoft; relatively little noise; multi-turn (7.6 turns on average) | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| ptt | [PTT Chinese corpus](https://github.com/zake7749/Gossiping-Chinese-Corpus) | 770K | Open-source project; the gossip board of Taiwan's PTT forum; Traditional Chinese; colloquial, everyday language with some noise | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| qingyun | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao | 100K | Qingyun corpus; fairly good quality; everyday language | |
| subtitle | [TV series dialogue corpus](https://github.com/aceimnorstuvwxz/dgk_lost_conv) | 2.74M | Crawled subtitles of movies and American TV series; some noise; loosely structured dialogue in which speakers cannot be matched up; multi-turn (5.3 turns on average) | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| tieba | [Tieba forum reply corpus](https://pan.baidu.com/s/1mUknfwy1nhSM7XzH8xi7gQ); password: i4si | 2.32M | Multi-turn; noisy | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| weibo | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao | 4.43M | From a Huawei paper | |
| xiaohuangji | [Xiaohuangji corpus](https://github.com/candlewill/Dialog_Corpus) | 450K | Corpus from the original Renren project; contains some indecent dialogue and a small amount of noise | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |

<details>
<summary>Reference data sources (expand to view)</summary>
<pre>
<code>
https://github.com/codemayq/chinese_chatbot_corpus
https://github.com/yangjianxin1/GPT2-chitchat
</code>
</pre>
</details>
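A minimal usage sketch for the card above, assuming the dataset loads through the standard `datasets` API with a `train` split (the repo id `qgyd2021/chinese_chitchat` comes from this record; the config layout is an assumption):

```python
from datasets import load_dataset

# Hypothetical usage sketch: the dataset id comes from this record. If
# the repo defines one config per source corpus, pass a config name as
# the second argument; the "train" split is an assumption.
data = load_dataset("qgyd2021/chinese_chitchat", split="train", streaming=True)

# Peek at one record; the card says the role field takes the values
# "unknown", "human", and "assistant".
sample = next(iter(data))
print(sample)
```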
qgyd2021/chinese_chitchat
[ "size_categories:100M<n<1B", "language:zh", "license:apache-2.0", "chitchat", "region:us" ]
2023-09-22T01:24:54+00:00
{"language": ["zh"], "license": "apache-2.0", "size_categories": ["100M<n<1B"], "tags": ["chitchat"]}
2023-09-22T07:39:11+00:00
[]
[ "zh" ]
TAGS #size_categories-100M<n<1B #language-Chinese #license-apache-2.0 #chitchat #region-us
Chinese Chitchat Dataset ------- The role field takes one of three values: "unknown", "human", "assistant". The datasets were collected and organized from the web as follows: Reference data sources (expand to view) ``` URL URL ```
[]
[ "TAGS\n#size_categories-100M<n<1B #language-Chinese #license-apache-2.0 #chitchat #region-us \n" ]
[ 35 ]
[ "passage: TAGS\n#size_categories-100M<n<1B #language-Chinese #license-apache-2.0 #chitchat #region-us \n" ]
376870dcb1055101b4507b9709d2e2d2090fcb2f
# Dataset Card for "novatusTest" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
minwook/novatusTest
[ "region:us" ]
2023-09-22T01:26:37+00:00
{"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1384, "num_examples": 2}], "download_size": 5958, "dataset_size": 1384}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T01:27:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for "novatusTest" More Information needed
[ "# Dataset Card for \"novatusTest\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"novatusTest\"\n\nMore Information needed" ]
[ 6, 13 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"novatusTest\"\n\nMore Information needed" ]
608cc1f2371c4dfb73c3e8bebfa41038b2c6fa29
# Dataset Card for "data_soict_train_synthesis_entity" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
thanhduycao/data_soict_train_synthesis_entity
[ "region:us" ]
2023-09-22T01:37:14+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "audio", "struct": [{"name": "array", "sequence": "float64"}, {"name": "path", "dtype": "string"}, {"name": "sampling_rate", "dtype": "int64"}]}, {"name": "sentence_norm", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6498333095, "num_examples": 18312}, {"name": "test", "num_bytes": 389981876, "num_examples": 748}], "download_size": 1639149838, "dataset_size": 6888314971}}
2023-09-22T01:39:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for "data_soict_train_synthesis_entity" More Information needed
[ "# Dataset Card for \"data_soict_train_synthesis_entity\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"data_soict_train_synthesis_entity\"\n\nMore Information needed" ]
[ 6, 25 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"data_soict_train_synthesis_entity\"\n\nMore Information needed" ]
2bea4b5a9b1c371ecd04ae4433b4129fe236a869
# Dataset of Nagatoro Hayase

This is the dataset of Nagatoro Hayase, containing 300 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 650 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 650 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 650 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 650 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
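The download links in the table are file names inside this dataset repository. A minimal sketch of fetching and unpacking one of the packages with standard `huggingface_hub` calls (the repo id comes from this record; the chosen package and output directory are illustrative):

```python
import zipfile
from huggingface_hub import hf_hub_download

# Fetch one aligned package from the table above; repo_type="dataset" is
# required because the zip files live in a dataset repository.
zip_path = hf_hub_download(
    repo_id="CyberHarem/nagatoro_hayase_donttoywithmemissnagatoro",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)

with zipfile.ZipFile(zip_path) as zf:
    zf.extractall("nagatoro_hayase_384x512")  # illustrative output directory
```

The same pattern applies to the other packaged datasets in this corpus; only the repo id and file name change.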
CyberHarem/nagatoro_hayase_donttoywithmemissnagatoro
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T01:44:25+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T01:47:57+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Nagatoro Hayase
==========================

This is the dataset of Nagatoro Hayase, containing 300 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
4ca02913204dfaa2f0ba62e60406faa1b844cadf
# Dataset of Wahira Nagomi

This is the dataset of Wahira Nagomi, containing 295 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 295 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 738 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 295 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 295 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 295 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 295 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 295 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 738 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 738 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 738 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/wahira_nagomi_akibameidosensou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T01:53:13+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T01:58:29+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Wahira Nagomi
========================

This is the dataset of Wahira Nagomi, containing 295 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
d3fcab7df8d70a1c162cab8717970fefca970f6a
# Dataset of Maki Gamou

This is the dataset of Maki Gamou, containing 141 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 141 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 351 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 141 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 141 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 141 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 141 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 141 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 351 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 351 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 351 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/maki_gamou_donttoywithmemissnagatoro
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T01:57:43+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T02:02:47+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Maki Gamou
=====================

This is the dataset of Maki Gamou, containing 141 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
4661405ec022e3b5c9a5cf6357a9b26e84cb4ab8
# Dataset of Yoshi

This is the dataset of Yoshi, containing 86 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 86 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 223 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 86 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 86 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 86 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 86 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 86 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 223 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 223 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 223 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/yoshi_donttoywithmemissnagatoro
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T02:08:36+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T02:11:10+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Yoshi
================

This is the dataset of Yoshi, containing 86 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
5c996d0144fb7c5df4c336534cc38075799bfa2e
# Dataset of Sakura

This is the dataset of Sakura, containing 79 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 79 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 178 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 79 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 79 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 79 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 79 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 79 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 178 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 178 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 178 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/sakura_donttoywithmemissnagatoro
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T02:16:15+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T02:17:32+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Sakura
=================

This is the dataset of Sakura, containing 79 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
4c7daa4ae0f45c5fb750675b5e3615cbaac13b1a
# Dataset of Mannen Ranko

This is the dataset of Mannen Ranko, containing 263 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 263 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 616 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 263 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 263 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 263 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 263 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 263 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 616 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 616 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 616 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/mannen_ranko_akibameidosensou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T02:19:04+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T02:23:41+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Mannen Ranko
=======================

This is the dataset of Mannen Ranko, containing 263 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
7ccea66d1c92fdd021ef611a40d055a06e0904ac
# Dataset of Yumechi

This is the dataset of Yumechi, containing 164 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 164 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 407 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 164 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 164 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 164 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 164 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 164 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 407 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 407 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 407 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/yumechi_akibameidosensou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T02:35:27+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T02:38:10+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Yumechi
==================

This is the dataset of Yumechi, containing 164 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
7585b12647b4bf7405ac607bc7af80f587e9e61d
# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-7b-chat-temp ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/YeungNLP/firefly-llama2-7b-chat-temp - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [YeungNLP/firefly-llama2-7b-chat-temp](https://huggingface.co/YeungNLP/firefly-llama2-7b-chat-temp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-22T03:37:32.448737](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp/blob/main/results_2023-09-22T03-37-32.448737.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.45675748784133063, "acc_stderr": 0.03523878979242221, "acc_norm": 0.46039529518457817, "acc_norm_stderr": 0.03522948379579862, "mc1": 0.31456548347613217, "mc1_stderr": 0.016255241993179178, "mc2": 0.46775413014717326, "mc2_stderr": 0.015305512973889742 }, "harness|arc:challenge|25": { "acc": 0.48293515358361777, "acc_stderr": 0.0146028783885366, "acc_norm": 0.5119453924914675, "acc_norm_stderr": 0.014607220340597167 }, "harness|hellaswag|10": { "acc": 0.5476000796654052, "acc_stderr": 0.004967118575905287, "acc_norm": 0.7332204740091616, "acc_norm_stderr": 0.004413722823053159 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45185185185185184, "acc_stderr": 0.04299268905480864, "acc_norm": 0.45185185185185184, "acc_norm_stderr": 0.04299268905480864 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4144736842105263, "acc_stderr": 0.04008973785779206, "acc_norm": 0.4144736842105263, "acc_norm_stderr": 0.04008973785779206 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5056603773584906, "acc_stderr": 0.03077090076385131, "acc_norm": 0.5056603773584906, "acc_norm_stderr": 0.03077090076385131 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4305555555555556, "acc_stderr": 0.04140685639111502, "acc_norm": 0.4305555555555556, "acc_norm_stderr": 0.04140685639111502 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.32, "acc_stderr":
## Latest results

These are the [latest results from run 2023-09-22T03:37:32.448737](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp/blob/main/results_2023-09-22T03-37-32.448737.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.45675748784133063,
        "acc_stderr": 0.03523878979242221,
        "acc_norm": 0.46039529518457817,
        "acc_norm_stderr": 0.03522948379579862,
        "mc1": 0.31456548347613217,
        "mc1_stderr": 0.016255241993179178,
        "mc2": 0.46775413014717326,
        "mc2_stderr": 0.015305512973889742
    },
    "harness|arc:challenge|25": {
        "acc": 0.48293515358361777,
        "acc_stderr": 0.0146028783885366,
        "acc_norm": 0.5119453924914675,
        "acc_norm_stderr": 0.014607220340597167
    },
    "harness|hellaswag|10": {
        "acc": 0.5476000796654052,
        "acc_stderr": 0.004967118575905287,
        "acc_norm": 0.7332204740091616,
        "acc_norm_stderr": 0.004413722823053159
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.33,
        "acc_stderr": 0.04725815626252606,
        "acc_norm": 0.33,
        "acc_norm_stderr": 0.04725815626252606
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.45185185185185184,
        "acc_stderr": 0.04299268905480864,
        "acc_norm": 0.45185185185185184,
        "acc_norm_stderr": 0.04299268905480864
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.4144736842105263,
        "acc_stderr": 0.04008973785779206,
        "acc_norm": 0.4144736842105263,
        "acc_norm_stderr": 0.04008973785779206
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.52,
        "acc_stderr": 0.050211673156867795,
        "acc_norm": 0.52,
        "acc_norm_stderr": 0.050211673156867795
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.5056603773584906,
        "acc_stderr": 0.03077090076385131,
        "acc_norm": 0.5056603773584906,
        "acc_norm_stderr": 0.03077090076385131
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.4305555555555556,
        "acc_stderr": 0.04140685639111502,
        "acc_norm": 0.4305555555555556,
        "acc_norm_stderr": 0.04140685639111502
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.32,
        "acc_stderr": 0.04688261722621504,
        "acc_norm": 0.32,
        "acc_norm_stderr": 0.04688261722621504
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.42,
        "acc_stderr": 0.049604496374885836,
        "acc_norm": 0.42,
        "acc_norm_stderr": 0.049604496374885836
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.39,
        "acc_stderr": 0.04902071300001975,
        "acc_norm": 0.39,
        "acc_norm_stderr": 0.04902071300001975
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.3988439306358382,
        "acc_stderr": 0.037336266553835096,
        "acc_norm": 0.3988439306358382,
        "acc_norm_stderr": 0.037336266553835096
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.1568627450980392,
        "acc_stderr": 0.036186648199362466,
        "acc_norm": 0.1568627450980392,
        "acc_norm_stderr": 0.036186648199362466
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.61,
        "acc_stderr": 0.04902071300001974,
        "acc_norm": 0.61,
        "acc_norm_stderr": 0.04902071300001974
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.3829787234042553,
        "acc_stderr": 0.03177821250236922,
        "acc_norm": 0.3829787234042553,
        "acc_norm_stderr": 0.03177821250236922
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.2807017543859649,
        "acc_stderr": 0.042270544512322004,
        "acc_norm": 0.2807017543859649,
        "acc_norm_stderr": 0.042270544512322004
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.4206896551724138,
        "acc_stderr": 0.0411391498118926,
        "acc_norm": 0.4206896551724138,
        "acc_norm_stderr": 0.0411391498118926
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.25925925925925924,
        "acc_stderr": 0.02256989707491841,
        "acc_norm": 0.25925925925925924,
        "acc_norm_stderr": 0.02256989707491841
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.2857142857142857,
        "acc_stderr": 0.04040610178208841,
        "acc_norm": 0.2857142857142857,
        "acc_norm_stderr": 0.04040610178208841
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.29,
        "acc_stderr": 0.04560480215720684,
        "acc_norm": 0.29,
        "acc_norm_stderr": 0.04560480215720684
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.4774193548387097,
        "acc_stderr": 0.028414985019707868,
        "acc_norm": 0.4774193548387097,
        "acc_norm_stderr": 0.028414985019707868
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.30049261083743845,
        "acc_stderr": 0.03225799476233484,
        "acc_norm": 0.30049261083743845,
        "acc_norm_stderr": 0.03225799476233484
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.4,
        "acc_stderr": 0.04923659639173309,
        "acc_norm": 0.4,
        "acc_norm_stderr": 0.04923659639173309
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.5515151515151515,
        "acc_stderr": 0.03883565977956929,
        "acc_norm": 0.5515151515151515,
        "acc_norm_stderr": 0.03883565977956929
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.5656565656565656,
        "acc_stderr": 0.03531505879359183,
        "acc_norm": 0.5656565656565656,
        "acc_norm_stderr": 0.03531505879359183
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.5751295336787565,
        "acc_stderr": 0.035674713352125395,
        "acc_norm": 0.5751295336787565,
        "acc_norm_stderr": 0.035674713352125395
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.4128205128205128,
        "acc_stderr": 0.024962683564331803,
        "acc_norm": 0.4128205128205128,
        "acc_norm_stderr": 0.024962683564331803
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.2851851851851852,
        "acc_stderr": 0.027528599210340492,
        "acc_norm": 0.2851851851851852,
        "acc_norm_stderr": 0.027528599210340492
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.39915966386554624,
        "acc_stderr": 0.03181110032413925,
        "acc_norm": 0.39915966386554624,
        "acc_norm_stderr": 0.03181110032413925
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.33774834437086093,
        "acc_stderr": 0.038615575462551684,
        "acc_norm": 0.33774834437086093,
        "acc_norm_stderr": 0.038615575462551684
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.5834862385321101,
        "acc_stderr": 0.021136376504030864,
        "acc_norm": 0.5834862385321101,
        "acc_norm_stderr": 0.021136376504030864
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.3101851851851852,
        "acc_stderr": 0.03154696285656629,
        "acc_norm": 0.3101851851851852,
        "acc_norm_stderr": 0.03154696285656629
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.5637254901960784,
        "acc_stderr": 0.03480693138457039,
        "acc_norm": 0.5637254901960784,
        "acc_norm_stderr": 0.03480693138457039
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.5991561181434599,
        "acc_stderr": 0.031900803894732356,
        "acc_norm": 0.5991561181434599,
        "acc_norm_stderr": 0.031900803894732356
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.5112107623318386,
        "acc_stderr": 0.033549366530984746,
        "acc_norm": 0.5112107623318386,
        "acc_norm_stderr": 0.033549366530984746
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.5267175572519084,
        "acc_stderr": 0.04379024936553894,
        "acc_norm": 0.5267175572519084,
        "acc_norm_stderr": 0.04379024936553894
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.628099173553719,
        "acc_stderr": 0.044120158066245044,
        "acc_norm": 0.628099173553719,
        "acc_norm_stderr": 0.044120158066245044
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.49074074074074076,
        "acc_stderr": 0.04832853553437055,
        "acc_norm": 0.49074074074074076,
        "acc_norm_stderr": 0.04832853553437055
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.49693251533742333,
        "acc_stderr": 0.03928297078179662,
        "acc_norm": 0.49693251533742333,
        "acc_norm_stderr": 0.03928297078179662
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.41964285714285715,
        "acc_stderr": 0.04684099321077106,
        "acc_norm": 0.41964285714285715,
        "acc_norm_stderr": 0.04684099321077106
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.5825242718446602,
        "acc_stderr": 0.048828405482122375,
        "acc_norm": 0.5825242718446602,
        "acc_norm_stderr": 0.048828405482122375
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.6666666666666666,
        "acc_stderr": 0.03088273697413866,
        "acc_norm": 0.6666666666666666,
        "acc_norm_stderr": 0.03088273697413866
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.51,
        "acc_stderr": 0.05024183937956913,
        "acc_norm": 0.51,
        "acc_norm_stderr": 0.05024183937956913
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.5862068965517241,
        "acc_stderr": 0.01761220408466377,
        "acc_norm": 0.5862068965517241,
        "acc_norm_stderr": 0.01761220408466377
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.5057803468208093,
        "acc_stderr": 0.026917296179149123,
        "acc_norm": 0.5057803468208093,
        "acc_norm_stderr": 0.026917296179149123
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.24134078212290502,
        "acc_stderr": 0.01431099954796145,
        "acc_norm": 0.24134078212290502,
        "acc_norm_stderr": 0.01431099954796145
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.49019607843137253,
        "acc_stderr": 0.028624412550167958,
        "acc_norm": 0.49019607843137253,
        "acc_norm_stderr": 0.028624412550167958
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.5144694533762058,
        "acc_stderr": 0.02838619808417768,
        "acc_norm": 0.5144694533762058,
        "acc_norm_stderr": 0.02838619808417768
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.5092592592592593,
        "acc_stderr": 0.027815973433878014,
        "acc_norm": 0.5092592592592593,
        "acc_norm_stderr": 0.027815973433878014
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.35106382978723405,
        "acc_stderr": 0.028473501272963764,
        "acc_norm": 0.35106382978723405,
        "acc_norm_stderr": 0.028473501272963764
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.3376792698826597,
        "acc_stderr": 0.012078563777145574,
        "acc_norm": 0.3376792698826597,
        "acc_norm_stderr": 0.012078563777145574
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.4264705882352941,
        "acc_stderr": 0.03004261583271487,
        "acc_norm": 0.4264705882352941,
        "acc_norm_stderr": 0.03004261583271487
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.44281045751633985,
        "acc_stderr": 0.02009508315457735,
        "acc_norm": 0.44281045751633985,
        "acc_norm_stderr": 0.02009508315457735
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.6,
        "acc_stderr": 0.0469237132203465,
        "acc_norm": 0.6,
        "acc_norm_stderr": 0.0469237132203465
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.40408163265306124,
        "acc_stderr": 0.03141470802586589,
        "acc_norm": 0.40408163265306124,
        "acc_norm_stderr": 0.03141470802586589
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.6467661691542289,
        "acc_stderr": 0.03379790611796777,
        "acc_norm": 0.6467661691542289,
        "acc_norm_stderr": 0.03379790611796777
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.7,
        "acc_stderr": 0.04605661864718381,
        "acc_norm": 0.7,
        "acc_norm_stderr": 0.04605661864718381
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.42771084337349397,
        "acc_stderr": 0.038515976837185335,
        "acc_norm": 0.42771084337349397,
        "acc_norm_stderr": 0.038515976837185335
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.672514619883041,
        "acc_stderr": 0.035993357714560276,
        "acc_norm": 0.672514619883041,
        "acc_norm_stderr": 0.035993357714560276
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.31456548347613217,
        "mc1_stderr": 0.016255241993179178,
        "mc2": 0.46775413014717326,
        "mc2_stderr": 0.015305512973889742
    }
}
```
"acc_norm_stderr": 0.02838619808417768 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5092592592592593, "acc_stderr": 0.027815973433878014, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.027815973433878014 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.35106382978723405, "acc_stderr": 0.028473501272963764, "acc_norm": 0.35106382978723405, "acc_norm_stderr": 0.028473501272963764 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3376792698826597, "acc_stderr": 0.012078563777145574, "acc_norm": 0.3376792698826597, "acc_norm_stderr": 0.012078563777145574 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4264705882352941, "acc_stderr": 0.03004261583271487, "acc_norm": 0.4264705882352941, "acc_norm_stderr": 0.03004261583271487 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.44281045751633985, "acc_stderr": 0.02009508315457735, "acc_norm": 0.44281045751633985, "acc_norm_stderr": 0.02009508315457735 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.40408163265306124, "acc_stderr": 0.03141470802586589, "acc_norm": 0.40408163265306124, "acc_norm_stderr": 0.03141470802586589 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6467661691542289, "acc_stderr": 0.03379790611796777, "acc_norm": 0.6467661691542289, "acc_norm_stderr": 0.03379790611796777 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.7, "acc_stderr": 0.04605661864718381, "acc_norm": 0.7, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.038515976837185335, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.672514619883041, "acc_stderr": 0.035993357714560276, "acc_norm": 0.672514619883041, "acc_norm_stderr": 0.035993357714560276 }, "harness|truthfulqa:mc|0": { "mc1": 0.31456548347613217, "mc1_stderr": 0.016255241993179178, "mc2": 0.46775413014717326, "mc2_stderr": 0.015305512973889742 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp
[ "region:us" ]
2023-09-22T02:37:56+00:00
{"pretty_name": "Evaluation run of YeungNLP/firefly-llama2-7b-chat-temp", "dataset_summary": "Dataset automatically created during the evaluation run of model [YeungNLP/firefly-llama2-7b-chat-temp](https://huggingface.co/YeungNLP/firefly-llama2-7b-chat-temp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T03:37:32.448737](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp/blob/main/results_2023-09-22T03-37-32.448737.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45675748784133063,\n \"acc_stderr\": 0.03523878979242221,\n \"acc_norm\": 0.46039529518457817,\n \"acc_norm_stderr\": 0.03522948379579862,\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.46775413014717326,\n \"mc2_stderr\": 0.015305512973889742\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.48293515358361777,\n \"acc_stderr\": 0.0146028783885366,\n \"acc_norm\": 0.5119453924914675,\n \"acc_norm_stderr\": 0.014607220340597167\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5476000796654052,\n \"acc_stderr\": 0.004967118575905287,\n \"acc_norm\": 0.7332204740091616,\n \"acc_norm_stderr\": 0.004413722823053159\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.03077090076385131,\n \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.03077090076385131\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n 
\"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.036186648199362466,\n \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.036186648199362466\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4774193548387097,\n \"acc_stderr\": 0.028414985019707868,\n \"acc_norm\": 0.4774193548387097,\n \"acc_norm_stderr\": 0.028414985019707868\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.03883565977956929,\n \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.03883565977956929\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5656565656565656,\n \"acc_stderr\": 0.03531505879359183,\n \"acc_norm\": 0.5656565656565656,\n \"acc_norm_stderr\": 0.03531505879359183\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5751295336787565,\n \"acc_stderr\": 0.035674713352125395,\n \"acc_norm\": 0.5751295336787565,\n \"acc_norm_stderr\": 0.035674713352125395\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331803,\n \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.03181110032413925,\n \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.03181110032413925\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5834862385321101,\n \"acc_stderr\": 0.021136376504030864,\n \"acc_norm\": 0.5834862385321101,\n \"acc_norm_stderr\": 0.021136376504030864\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656629,\n \"acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656629\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5637254901960784,\n \"acc_stderr\": 0.03480693138457039,\n \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.03480693138457039\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5991561181434599,\n \"acc_stderr\": 0.031900803894732356,\n \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.031900803894732356\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179662,\n \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179662\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03088273697413866,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03088273697413866\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.01761220408466377,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.01761220408466377\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.026917296179149123,\n \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.026917296179149123\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.01431099954796145,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.01431099954796145\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.028624412550167958,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.028624412550167958\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5144694533762058,\n \"acc_stderr\": 0.02838619808417768,\n \"acc_norm\": 0.5144694533762058,\n \"acc_norm_stderr\": 0.02838619808417768\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3376792698826597,\n \"acc_stderr\": 0.012078563777145574,\n \"acc_norm\": 0.3376792698826597,\n \"acc_norm_stderr\": 0.012078563777145574\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.03004261583271487,\n \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.03004261583271487\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.44281045751633985,\n \"acc_stderr\": 0.02009508315457735,\n \"acc_norm\": 0.44281045751633985,\n \"acc_norm_stderr\": 0.02009508315457735\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.03141470802586589,\n \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.03141470802586589\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.46775413014717326,\n \"mc2_stderr\": 0.015305512973889742\n }\n}\n```", "repo_url": "https://huggingface.co/YeungNLP/firefly-llama2-7b-chat-temp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", 
"configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|arc:challenge|25_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hellaswag|10_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-37-32.448737.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-37-32.448737.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-37-32.448737.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T03-37-32.448737.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T03-37-32.448737.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T03_37_32.448737", "path": ["results_2023-09-22T03-37-32.448737.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T03-37-32.448737.parquet"]}]}]}
2023-09-22T02:38:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-7b-chat-temp ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model YeungNLP/firefly-llama2-7b-chat-temp on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T03:37:32.448737 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-7b-chat-temp", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-llama2-7b-chat-temp on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T03:37:32.448737(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-7b-chat-temp", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-llama2-7b-chat-temp on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T03:37:32.448737(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-7b-chat-temp## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-llama2-7b-chat-temp on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T03:37:32.448737(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ab9ae6bd164bcac02b03e8aab233f946cc272fd0
# Dataset Card for Evaluation run of jb723/llama2-ko-7B-model

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/jb723/llama2-ko-7B-model
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [jb723/llama2-ko-7B-model](https://huggingface.co/jb723/llama2-ko-7B-model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jb723__llama2-ko-7B-model",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-28T09:35:48.028758](https://huggingface.co/datasets/open-llm-leaderboard/details_jb723__llama2-ko-7B-model/blob/main/results_2023-10-28T09-35-48.028758.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.23427013422818793,
        "em_stderr": 0.004337464243138509,
        "f1": 0.3152516778523505,
        "f1_stderr": 0.004353725712557671,
        "acc": 0.37318847300668456,
        "acc_stderr": 0.00848793474651324
    },
    "harness|drop|3": {
        "em": 0.23427013422818793,
        "em_stderr": 0.004337464243138509,
        "f1": 0.3152516778523505,
        "f1_stderr": 0.004353725712557671
    },
    "harness|gsm8k|5": {
        "acc": 0.02577710386656558,
        "acc_stderr": 0.0043650429536218095
    },
    "harness|winogrande|5": {
        "acc": 0.7205998421468035,
        "acc_stderr": 0.012610826539404667
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
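The summary above mentions the aggregated "results" configuration but only demonstrates a per-task load. A minimal sketch of pulling the aggregated metrics as well; the config and split names come from this card's metadata, while the exact column layout of the returned row is an assumption:

```python
from datasets import load_dataset

# "results" config and "latest" split are declared in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_jb723__llama2-ko-7B-model",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics of the newest run
```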
open-llm-leaderboard/details_jb723__llama2-ko-7B-model
[ "region:us" ]
2023-09-22T02:46:33+00:00
{"pretty_name": "Evaluation run of jb723/llama2-ko-7B-model", "dataset_summary": "Dataset automatically created during the evaluation run of model [jb723/llama2-ko-7B-model](https://huggingface.co/jb723/llama2-ko-7B-model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jb723__llama2-ko-7B-model\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T09:35:48.028758](https://huggingface.co/datasets/open-llm-leaderboard/details_jb723__llama2-ko-7B-model/blob/main/results_2023-10-28T09-35-48.028758.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23427013422818793,\n \"em_stderr\": 0.004337464243138509,\n \"f1\": 0.3152516778523505,\n \"f1_stderr\": 0.004353725712557671,\n \"acc\": 0.37318847300668456,\n \"acc_stderr\": 0.00848793474651324\n },\n \"harness|drop|3\": {\n \"em\": 0.23427013422818793,\n \"em_stderr\": 0.004337464243138509,\n \"f1\": 0.3152516778523505,\n \"f1_stderr\": 0.004353725712557671\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02577710386656558,\n \"acc_stderr\": 0.0043650429536218095\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404667\n }\n}\n```", "repo_url": "https://huggingface.co/jb723/llama2-ko-7B-model", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|arc:challenge|25_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T09_35_48.028758", "path": ["**/details_harness|drop|3_2023-10-28T09-35-48.028758.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T09-35-48.028758.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T09_35_48.028758", "path": ["**/details_harness|gsm8k|5_2023-10-28T09-35-48.028758.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T09-35-48.028758.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hellaswag|10_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-46-09.444345.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-46-09.444345.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T03-46-09.444345.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T03-46-09.444345.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T03-46-09.444345.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T09_35_48.028758", "path": ["**/details_harness|winogrande|5_2023-10-28T09-35-48.028758.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T09-35-48.028758.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T03_46_09.444345", "path": ["results_2023-09-22T03-46-09.444345.parquet"]}, {"split": "2023_10_28T09_35_48.028758", "path": ["results_2023-10-28T09-35-48.028758.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T09-35-48.028758.parquet"]}]}]}
2023-10-28T08:36:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jb723/llama2-ko-7B-model

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model jb723/llama2-ko-7B-model on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (the stripped code fence is restored after this passage):

## Latest results

These are the latest results from run 2023-10-28T09:35:48.028758 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
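The code fence stripped from the flattened card above is preserved verbatim in this record's `dataset_summary` metadata; restored here for readability:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jb723__llama2-ko-7B-model",
	"harness_winogrande_5",
	split="train")
```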
[ "# Dataset Card for Evaluation run of jb723/llama2-ko-7B-model", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jb723/llama2-ko-7B-model on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T09:35:48.028758(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jb723/llama2-ko-7B-model", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jb723/llama2-ko-7B-model on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T09:35:48.028758(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jb723/llama2-ko-7B-model## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jb723/llama2-ko-7B-model on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T09:35:48.028758(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2462ee9f4954d0b975362bac4979ca17a01d73ef
# Dataset of Shiipon

This is the dataset of Shiipon, containing 132 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 132 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 327 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 132 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 132 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 132 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 132 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 132 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 327 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 327 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 327 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
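The download links in the table are relative paths, which suggests the zip archives sit at the root of this dataset repo. Under that assumption, a minimal sketch of fetching one of them programmatically with `huggingface_hub`:

```python
from huggingface_hub import hf_hub_download

# Repo id from this record; "dataset-raw.zip" is the first file listed in
# the table above. Assumes the archives live at the repo root.
path = hf_hub_download(
    repo_id="CyberHarem/shiipon_akibameidosensou",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)  # local path of the cached archive
```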
CyberHarem/shiipon_akibameidosensou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T02:46:47+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T02:48:18+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Shiipon
==================

This is the dataset of Shiipon, containing 132 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
02f69dee3c4b830fe37f44b6845e4a1d0efbddbf
# Dataset Card for Evaluation run of PocketDoc/Dans-RetroRodeo-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/PocketDoc/Dans-RetroRodeo-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [PocketDoc/Dans-RetroRodeo-13b](https://huggingface.co/PocketDoc/Dans-RetroRodeo-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T07:46:06.592709](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b/blob/main/results_2023-10-27T07-46-06.592709.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.016883389261744968,
        "em_stderr": 0.0013193863452894662,
        "f1": 0.07098993288590594,
        "f1_stderr": 0.0017755070196112286,
        "acc": 0.3689818468823994,
        "acc_stderr": 0.006179472215818783
    },
    "harness|drop|3": {
        "em": 0.016883389261744968,
        "em_stderr": 0.0013193863452894662,
        "f1": 0.07098993288590594,
        "f1_stderr": 0.0017755070196112286
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.7379636937647988,
        "acc_stderr": 0.012358944431637566
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
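The card's example uses the "train" split; per this card's metadata, every task config also exposes a "latest" split pointing at the newest run, so the gsm8k details (whose accuracy is 0.0 above) can be pulled the same way. A minimal sketch, with config and split names taken from the metadata and the per-example column names left unspecified:

```python
from datasets import load_dataset

# "harness_gsm8k_5" config and "latest" split are declared in this card's
# metadata; the per-example schema is not shown there, so it is not assumed.
gsm8k = load_dataset(
    "open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b",
    "harness_gsm8k_5",
    split="latest",
)
print(len(gsm8k))  # number of per-example records in the newest gsm8k run
```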
open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b
[ "region:us" ]
2023-09-22T02:52:14+00:00
{"pretty_name": "Evaluation run of PocketDoc/Dans-RetroRodeo-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [PocketDoc/Dans-RetroRodeo-13b](https://huggingface.co/PocketDoc/Dans-RetroRodeo-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T07:46:06.592709](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b/blob/main/results_2023-10-27T07-46-06.592709.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016883389261744968,\n \"em_stderr\": 0.0013193863452894662,\n \"f1\": 0.07098993288590594,\n \"f1_stderr\": 0.0017755070196112286,\n \"acc\": 0.3689818468823994,\n \"acc_stderr\": 0.006179472215818783\n },\n \"harness|drop|3\": {\n \"em\": 0.016883389261744968,\n \"em_stderr\": 0.0013193863452894662,\n \"f1\": 0.07098993288590594,\n \"f1_stderr\": 0.0017755070196112286\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7379636937647988,\n \"acc_stderr\": 0.012358944431637566\n }\n}\n```", "repo_url": "https://huggingface.co/PocketDoc/Dans-RetroRodeo-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|arc:challenge|25_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T07_46_06.592709", "path": ["**/details_harness|drop|3_2023-10-27T07-46-06.592709.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T07-46-06.592709.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T07_46_06.592709", "path": ["**/details_harness|gsm8k|5_2023-10-27T07-46-06.592709.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T07-46-06.592709.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hellaswag|10_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-51-50.269402.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-51-50.269402.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T03-51-50.269402.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T03-51-50.269402.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T03-51-50.269402.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T07_46_06.592709", "path": ["**/details_harness|winogrande|5_2023-10-27T07-46-06.592709.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T07-46-06.592709.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T03_51_50.269402", "path": ["results_2023-09-22T03-51-50.269402.parquet"]}, {"split": "2023_10_27T07_46_06.592709", "path": ["results_2023-10-27T07-46-06.592709.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T07-46-06.592709.parquet"]}]}]}
2023-10-27T06:46:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PocketDoc/Dans-RetroRodeo-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model PocketDoc/Dans-RetroRodeo-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-27T07:46:06.592709 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of PocketDoc/Dans-RetroRodeo-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PocketDoc/Dans-RetroRodeo-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T07:46:06.592709(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PocketDoc/Dans-RetroRodeo-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PocketDoc/Dans-RetroRodeo-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T07:46:06.592709(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PocketDoc/Dans-RetroRodeo-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PocketDoc/Dans-RetroRodeo-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T07:46:06.592709(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7acb9189aea75e2bcc571ee967bb583dbef2ec1d
# Dataset of Zoya

This is the dataset of Zoya, containing 114 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 114 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 276 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 114 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 114 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 114 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 114 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 114 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 276 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 276 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 276 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
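The table above lists plain zip archives rather than a loadable `datasets` configuration, so one way to fetch them is with `huggingface_hub`. A minimal sketch, assuming the archives sit at the root of the dataset repo (the filename below is taken from the table's Download column):

```python
from huggingface_hub import hf_hub_download

# Fetch one of the aligned archives listed in the table; the file is
# cached locally and its path is returned.
path = hf_hub_download(
    repo_id="CyberHarem/zoya_akibameidosensou",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(path)
```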
CyberHarem/zoya_akibameidosensou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T02:55:21+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T02:56:27+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Zoya =============== This is the dataset of Zoya, containing 114 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
8526079b61875d1d75ee4965ec4cee71256fc417
# Dataset Card for "OpenOrca-20k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
timothyckl/OpenOrca-20k
[ "region:us" ]
2023-09-22T02:55:42+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "system_prompt", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 33921209, "num_examples": 20000}], "download_size": 19510634, "dataset_size": 33921209}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T02:59:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for "OpenOrca-20k" More Information needed
[ "# Dataset Card for \"OpenOrca-20k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"OpenOrca-20k\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"OpenOrca-20k\"\n\nMore Information needed" ]
dca3f13b2ba654ce6b1bb45022323f849e09ff17
# Dataset of Yaegashi Yasuko

This is the dataset of Yaegashi Yasuko, containing 156 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 156 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 372 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 156 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 156 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 156 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 156 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 156 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 372 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 372 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 372 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
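As with the Zoya archives earlier, these zips can be fetched with `huggingface_hub`; this sketch adds the unpacking step using the standard-library `zipfile` module. It assumes the same repo-root layout implied by the table, and the output directory name is illustrative:

```python
import zipfile

from huggingface_hub import hf_hub_download

# Download one archive named in the table, then extract it locally.
path = hf_hub_download(
    repo_id="CyberHarem/yaegashi_yasuko_akibameidosensou",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("yaegashi_yasuko_384x512")
```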
CyberHarem/yaegashi_yasuko_akibameidosensou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T03:07:48+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T03:12:46+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Yaegashi Yasuko ========================== This is the dataset of Yaegashi Yasuko, containing 156 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
5167a7dd6ca98f30fe5c788d63a80cac777f6289
# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-1.1bee

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/BEE-spoke-data/TinyLlama-1.1bee
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [BEE-spoke-data/TinyLlama-1.1bee](https://huggingface.co/BEE-spoke-data/TinyLlama-1.1bee) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-29T06:23:03.017532](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee/blob/main/results_2023-10-29T06-23-03.017532.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0018875838926174498,
        "em_stderr": 0.00044451099905590025,
        "f1": 0.03736262583892626,
        "f1_stderr": 0.0011684343114881778,
        "acc": 0.27343398918005496,
        "acc_stderr": 0.007654321426298602
    },
    "harness|drop|3": {
        "em": 0.0018875838926174498,
        "em_stderr": 0.00044451099905590025,
        "f1": 0.03736262583892626,
        "f1_stderr": 0.0011684343114881778
    },
    "harness|gsm8k|5": {
        "acc": 0.002274450341167551,
        "acc_stderr": 0.0013121578148674216
    },
    "harness|winogrande|5": {
        "acc": 0.5445935280189423,
        "acc_stderr": 0.013996485037729782
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
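Beyond the winogrande example in the card above, each task's details live in their own configuration, and the repo's config list shows that every run is also exposed under a timestamp-named split alongside "latest". A minimal sketch using the `harness_drop_3` configuration named in the metadata:

```python
from datasets import load_dataset

# Per-task details follow the harness_<task>_<n-shot> config naming;
# "latest" resolves to the most recent run's split.
drop_details = load_dataset(
    "open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee",
    "harness_drop_3",
    split="latest",
)
print(drop_details)
```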
open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee
[ "region:us" ]
2023-09-22T03:13:33+00:00
{"pretty_name": "Evaluation run of BEE-spoke-data/TinyLlama-1.1bee", "dataset_summary": "Dataset automatically created during the evaluation run of model [BEE-spoke-data/TinyLlama-1.1bee](https://huggingface.co/BEE-spoke-data/TinyLlama-1.1bee) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T06:23:03.017532](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee/blob/main/results_2023-10-29T06-23-03.017532.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905590025,\n \"f1\": 0.03736262583892626,\n \"f1_stderr\": 0.0011684343114881778,\n \"acc\": 0.27343398918005496,\n \"acc_stderr\": 0.007654321426298602\n },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905590025,\n \"f1\": 0.03736262583892626,\n \"f1_stderr\": 0.0011684343114881778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674216\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5445935280189423,\n \"acc_stderr\": 0.013996485037729782\n }\n}\n```", "repo_url": "https://huggingface.co/BEE-spoke-data/TinyLlama-1.1bee", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|arc:challenge|25_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T06_23_03.017532", "path": ["**/details_harness|drop|3_2023-10-29T06-23-03.017532.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T06-23-03.017532.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T06_23_03.017532", "path": ["**/details_harness|gsm8k|5_2023-10-29T06-23-03.017532.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T06-23-03.017532.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hellaswag|10_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-13-14.200799.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-13-14.200799.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T04-13-14.200799.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T04-13-14.200799.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T04-13-14.200799.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T06_23_03.017532", "path": ["**/details_harness|winogrande|5_2023-10-29T06-23-03.017532.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T06-23-03.017532.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T04_13_14.200799", "path": ["results_2023-09-22T04-13-14.200799.parquet"]}, {"split": "2023_10_29T06_23_03.017532", "path": ["results_2023-10-29T06-23-03.017532.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T06-23-03.017532.parquet"]}]}]}
2023-10-29T06:23:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-1.1bee ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model BEE-spoke-data/TinyLlama-1.1bee on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch below): ## Latest results These are the latest results from run 2023-10-29T06:23:03.017532 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
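The loading snippet referenced above was dropped from this flattened copy of the card; the record's own `configs` metadata preserves the call, so the sketch below simply restores it (the config name `harness_winogrande_5` is taken from that metadata, not invented):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task; the config name
# "harness_winogrande_5" comes from this record's own `configs` metadata.
data = load_dataset(
    "open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
```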
[ "# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-1.1bee", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/TinyLlama-1.1bee on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T06:23:03.017532(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-1.1bee", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/TinyLlama-1.1bee on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T06:23:03.017532(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-1.1bee## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/TinyLlama-1.1bee on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T06:23:03.017532(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
09d998455617154e5b21057abc378147bb1d5cc1
# Dataset Card for "qa_wikipedia_retrieved_chunks-og" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
legacy107/qa_wikipedia_retrieved_chunks-og
[ "region:us" ]
2023-09-22T03:18:35+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer_start", "dtype": "int64"}, {"name": "answer", "dtype": "string"}, {"name": "article", "dtype": "string"}, {"name": "retrieved_context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6210772758, "num_examples": 110970}, {"name": "validation", "num_bytes": 732036635, "num_examples": 13833}, {"name": "test", "num_bytes": 762734936, "num_examples": 13873}], "download_size": 417751805, "dataset_size": 7705544329}}
2023-09-22T03:20:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for "qa_wikipedia_retrieved_chunks-og" More Information needed
[ "# Dataset Card for \"qa_wikipedia_retrieved_chunks-og\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"qa_wikipedia_retrieved_chunks-og\"\n\nMore Information needed" ]
[ 6, 23 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"qa_wikipedia_retrieved_chunks-og\"\n\nMore Information needed" ]
d7addebbf68a936df1e7b20bd9d1be3042923efc
# Dataset of Nagi

This is the dataset of Nagi, containing 70 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name        | Images   | Download                             | Description                                                               |
|:------------|---------:|:------------------------------------|:--------------------------------------------------------------------------|
| raw         | 70       | [Download](dataset-raw.zip)          | Raw data with meta information.                                           |
| raw-stage3  | 167      | [Download](dataset-raw-stage3.zip)   | 3-stage cropped raw data with meta information.                           |
| 384x512     | 70       | [Download](dataset-384x512.zip)      | 384x512 aligned dataset.                                                  |
| 512x512     | 70       | [Download](dataset-512x512.zip)      | 512x512 aligned dataset.                                                  |
| 512x704     | 70       | [Download](dataset-512x704.zip)      | 512x704 aligned dataset.                                                  |
| 640x640     | 70       | [Download](dataset-640x640.zip)      | 640x640 aligned dataset.                                                  |
| 640x880     | 70       | [Download](dataset-640x880.zip)      | 640x880 aligned dataset.                                                  |
| stage3-640  | 167      | [Download](dataset-stage3-640.zip)   | 3-stage cropped dataset with the shorter side not exceeding 640 pixels.   |
| stage3-800  | 167      | [Download](dataset-stage3-800.zip)   | 3-stage cropped dataset with the shorter side not exceeding 800 pixels.   |
| stage3-1200 | 167      | [Download](dataset-stage3-1200.zip)  | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels.  |
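The table above exposes each package as a zip file in the dataset repository. Assuming the filenames in the table are the actual paths at the repository root (an assumption, since the card only shows relative links), one way to fetch a package is with `huggingface_hub`:

```python
from huggingface_hub import hf_hub_download

# Sketch only: assumes the zip filenames from the table above sit at the
# root of the dataset repository.
path = hf_hub_download(
    repo_id="CyberHarem/nagi_akibameidosensou",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded archive
```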
CyberHarem/nagi_akibameidosensou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-22T03:19:23+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-22T03:20:49+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Nagi
===============


This is the dataset of Nagi, containing 70 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
b7a6e35320fa021a3853f1902a454f644b89ed0a
# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora_cds

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/DevaMalla/llama_7b_qlora_cds
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_qlora_cds](https://huggingface.co/DevaMalla/llama_7b_qlora_cds) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T04:49:41.521355](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds/blob/main/results_2023-10-27T04-49-41.521355.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001153523489932886,
        "em_stderr": 0.0003476179896857104,
        "f1": 0.05950083892617462,
        "f1_stderr": 0.0013434612460508462,
        "acc": 0.37919144217863743,
        "acc_stderr": 0.00905606982363287
    },
    "harness|drop|3": {
        "em": 0.001153523489932886,
        "em_stderr": 0.0003476179896857104,
        "f1": 0.05950083892617462,
        "f1_stderr": 0.0013434612460508462
    },
    "harness|gsm8k|5": {
        "acc": 0.04094010614101592,
        "acc_stderr": 0.0054580767962943534
    },
    "harness|winogrande|5": {
        "acc": 0.7174427782162589,
        "acc_stderr": 0.012654062850971388
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
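The snippet in the card loads one task's per-sample details; the aggregated numbers live in the "results" configuration mentioned in the summary. A sketch, assuming this record follows the same split layout as the sibling evaluation-run records in this dump (where each "results" config exposes a "latest" split):

```python
from datasets import load_dataset

# Sketch: the "results" config stores the aggregated metrics of each run;
# the "latest" split name mirrors the config layout of the sibling records.
results = load_dataset(
    "open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds",
    "results",
    split="latest",
)
```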
open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds
[ "region:us" ]
2023-09-22T03:46:11+00:00
{"pretty_name": "Evaluation run of DevaMalla/llama_7b_qlora_cds", "dataset_summary": "Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_qlora_cds](https://huggingface.co/DevaMalla/llama_7b_qlora_cds) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T04:49:41.521355](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds/blob/main/results_2023-10-27T04-49-41.521355.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857104,\n \"f1\": 0.05950083892617462,\n \"f1_stderr\": 0.0013434612460508462,\n \"acc\": 0.37919144217863743,\n \"acc_stderr\": 0.00905606982363287\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857104,\n \"f1\": 0.05950083892617462,\n \"f1_stderr\": 0.0013434612460508462\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04094010614101592,\n \"acc_stderr\": 0.0054580767962943534\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7174427782162589,\n \"acc_stderr\": 0.012654062850971388\n }\n}\n```", "repo_url": "https://huggingface.co/DevaMalla/llama_7b_qlora_cds", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|arc:challenge|25_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T04_49_41.521355", "path": ["**/details_harness|drop|3_2023-10-27T04-49-41.521355.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T04-49-41.521355.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T04_49_41.521355", "path": ["**/details_harness|gsm8k|5_2023-10-27T04-49-41.521355.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T04-49-41.521355.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hellaswag|10_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-45-53.038804.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-45-53.038804.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T04-45-53.038804.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T04-45-53.038804.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T04-45-53.038804.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T04_49_41.521355", "path": ["**/details_harness|winogrande|5_2023-10-27T04-49-41.521355.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T04-49-41.521355.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T04_45_53.038804", "path": ["results_2023-09-22T04-45-53.038804.parquet"]}, {"split": "2023_10_27T04_49_41.521355", "path": ["results_2023-10-27T04-49-41.521355.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T04-49-41.521355.parquet"]}]}]}
2023-10-27T03:49:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora_cds ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model DevaMalla/llama_7b_qlora_cds on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch after this card): ## Latest results These are the latest results from run 2023-10-27T04:49:41.521355 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
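The loading snippet referenced in the card above was stripped in this flattened copy. A minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repo naming; the dataset id `open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds` and the `harness_winogrande_5` config are inferred from that convention rather than stated in this record:

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the leaderboard's details_<org>__<model> naming convention.
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds",
	"harness_winogrande_5",
	split="train")
```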
[ "# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora_cds", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model DevaMalla/llama_7b_qlora_cds on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T04:49:41.521355(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora_cds", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model DevaMalla/llama_7b_qlora_cds on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T04:49:41.521355(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora_cds## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model DevaMalla/llama_7b_qlora_cds on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T04:49:41.521355(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
181d94c0f4dbbf50dbf8811fdbef6f5ee56a4150
# Dataset Card for "data_soict_train_with_entity" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
thanhduycao/data_soict_train_with_entity
[ "region:us" ]
2023-09-22T04:00:11+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "audio", "struct": [{"name": "array", "sequence": "float64"}, {"name": "path", "dtype": "string"}, {"name": "sampling_rate", "dtype": "int64"}]}, {"name": "sentence_norm", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3955020961, "num_examples": 11629}, {"name": "test", "num_bytes": 389981876, "num_examples": 748}], "download_size": 1036033410, "dataset_size": 4345002837}}
2023-09-22T04:01:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "data_soict_train_with_entity" More Information needed
[ "# Dataset Card for \"data_soict_train_with_entity\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"data_soict_train_with_entity\"\n\nMore Information needed" ]
[ 6, 23 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"data_soict_train_with_entity\"\n\nMore Information needed" ]
dae52562603787e65ee00cc95561bab4fa818d9b
A Chinese financial-text corpus, including (before compression):

- Listed-company announcements: announcement_data.jsonl, 20G
- Financial news and articles:
  - fin_news_data.jsonl, 30G
  - fin_articles_data.jsonl, 10G
- Financial exam questions: fin_exam.jsonl, 370M

Data format:

```
{
    "text": <text content>,
    "meta": {
        "source": <data source>
    }
}
```
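A minimal sketch of iterating over one of the JSONL files listed above, assuming each line holds one JSON object in the format shown; the local path `fin_exam.jsonl` is a placeholder for wherever the file has been downloaded and decompressed:

```python
import json

# Placeholder path: point this at a downloaded, decompressed FinCorpus file.
with open("fin_exam.jsonl", encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)
        text = record["text"]               # document text
        source = record["meta"]["source"]   # data source
        print(source, text[:50])
```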
Duxiaoman-DI/FinCorpus
[ "size_categories:10M<n<100M", "language:zh", "license:apache-2.0", "finance", "region:us" ]
2023-09-22T04:01:30+00:00
{"language": ["zh"], "license": "apache-2.0", "size_categories": ["10M<n<100M"], "tags": ["finance"]}
2023-09-22T09:10:10+00:00
[]
[ "zh" ]
TAGS #size_categories-10M<n<100M #language-Chinese #license-apache-2.0 #finance #region-us
A Chinese financial-text corpus, including (before compression): - Listed-company announcements announcement_data.jsonl 20G - Financial news and articles - fin_news_data.jsonl 30G - fin_articles_data.jsonl 10G - Financial exam questions fin_exam.jsonl 370M Data format:
[]
[ "TAGS\n#size_categories-10M<n<100M #language-Chinese #license-apache-2.0 #finance #region-us \n" ]
[ 34 ]
[ "passage: TAGS\n#size_categories-10M<n<100M #language-Chinese #license-apache-2.0 #finance #region-us \n" ]
5748ccb83a0e13a63ea9158877cf013e0f8d63fc
# Dataset Card for Evaluation run of TinyPixel/elm-test

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/TinyPixel/elm-test
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TinyPixel/elm-test](https://huggingface.co/TinyPixel/elm-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TinyPixel__elm-test",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-28T16:54:03.304592](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__elm-test/blob/main/results_2023-10-28T16-54-03.304592.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.0003630560893119392,
        "f1": 0.05654886744966456,
        "f1_stderr": 0.0013251750673152706,
        "acc": 0.4092727084508905,
        "acc_stderr": 0.00976564057712332
    },
    "harness|drop|3": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.0003630560893119392,
        "f1": 0.05654886744966456,
        "f1_stderr": 0.0013251750673152706
    },
    "harness|gsm8k|5": {
        "acc": 0.07505686125852919,
        "acc_stderr": 0.007257633145486643
    },
    "harness|winogrande|5": {
        "acc": 0.7434885556432518,
        "acc_stderr": 0.012273648008759996
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_TinyPixel__elm-test
[ "region:us" ]
2023-09-22T04:13:27+00:00
{"pretty_name": "Evaluation run of TinyPixel/elm-test", "dataset_summary": "Dataset automatically created during the evaluation run of model [TinyPixel/elm-test](https://huggingface.co/TinyPixel/elm-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyPixel__elm-test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T16:54:03.304592](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__elm-test/blob/main/results_2023-10-28T16-54-03.304592.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119392,\n \"f1\": 0.05654886744966456,\n \"f1_stderr\": 0.0013251750673152706,\n \"acc\": 0.4092727084508905,\n \"acc_stderr\": 0.00976564057712332\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119392,\n \"f1\": 0.05654886744966456,\n \"f1_stderr\": 0.0013251750673152706\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07505686125852919,\n \"acc_stderr\": 0.007257633145486643\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759996\n }\n}\n```", "repo_url": "https://huggingface.co/TinyPixel/elm-test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|arc:challenge|25_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T16_54_03.304592", "path": ["**/details_harness|drop|3_2023-10-28T16-54-03.304592.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T16-54-03.304592.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T16_54_03.304592", "path": ["**/details_harness|gsm8k|5_2023-10-28T16-54-03.304592.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T16-54-03.304592.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hellaswag|10_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T05-13-08.764414.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T05-13-08.764414.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T16_54_03.304592", "path": ["**/details_harness|winogrande|5_2023-10-28T16-54-03.304592.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T16-54-03.304592.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T05_13_08.764414", "path": ["results_2023-09-22T05-13-08.764414.parquet"]}, {"split": "2023_10_28T16_54_03.304592", "path": ["results_2023-10-28T16-54-03.304592.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T16-54-03.304592.parquet"]}]}]}
2023-10-28T15:54:15+00:00
[]
[]
1f4574b306a40818a35162fa577224b118ece765
# Dataset Card for "elm-m" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
TinyPixel/elm-m
[ "region:us" ]
2023-09-22T04:29:59+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2555808, "num_examples": 1073}], "download_size": 1391592, "dataset_size": 2555808}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T04:30:03+00:00
[]
[]
a2fd20daed26874ab6aee143257165f90233f9f6
# Dataset Card for Evaluation run of glaiveai/glaive-coder-7b

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/glaiveai/glaive-coder-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [glaiveai/glaive-coder-7b](https://huggingface.co/glaiveai/glaive-coder-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_glaiveai__glaive-coder-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-29T06:05:57.317368](https://huggingface.co/datasets/open-llm-leaderboard/details_glaiveai__glaive-coder-7b/blob/main/results_2023-10-29T06-05-57.317368.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.00388003355704698,
        "em_stderr": 0.0006366682825519943,
        "f1": 0.055515939597315614,
        "f1_stderr": 0.0014057901382845646,
        "acc": 0.32489335335120895,
        "acc_stderr": 0.009957962270331142
    },
    "harness|drop|3": {
        "em": 0.00388003355704698,
        "em_stderr": 0.0006366682825519943,
        "f1": 0.055515939597315614,
        "f1_stderr": 0.0014057901382845646
    },
    "harness|gsm8k|5": {
        "acc": 0.052312357846853674,
        "acc_stderr": 0.006133057708959239
    },
    "harness|winogrande|5": {
        "acc": 0.5974743488555643,
        "acc_stderr": 0.013782866831703044
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
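The "Latest results" block above is a copy of the aggregated metrics; they can also be read programmatically from the "results" configuration the card describes. A minimal sketch, with the config and split names taken from this record's metadata below:

```python
from datasets import load_dataset

# "results" is the aggregated-results configuration described above;
# the "latest" split always tracks the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_glaiveai__glaive-coder-7b",
    "results",
    split="latest",
)
print(results)
```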
open-llm-leaderboard/details_glaiveai__glaive-coder-7b
[ "region:us" ]
2023-09-22T04:33:36+00:00
{"pretty_name": "Evaluation run of glaiveai/glaive-coder-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [glaiveai/glaive-coder-7b](https://huggingface.co/glaiveai/glaive-coder-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_glaiveai__glaive-coder-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T06:05:57.317368](https://huggingface.co/datasets/open-llm-leaderboard/details_glaiveai__glaive-coder-7b/blob/main/results_2023-10-29T06-05-57.317368.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00388003355704698,\n \"em_stderr\": 0.0006366682825519943,\n \"f1\": 0.055515939597315614,\n \"f1_stderr\": 0.0014057901382845646,\n \"acc\": 0.32489335335120895,\n \"acc_stderr\": 0.009957962270331142\n },\n \"harness|drop|3\": {\n \"em\": 0.00388003355704698,\n \"em_stderr\": 0.0006366682825519943,\n \"f1\": 0.055515939597315614,\n \"f1_stderr\": 0.0014057901382845646\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.052312357846853674,\n \"acc_stderr\": 0.006133057708959239\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5974743488555643,\n \"acc_stderr\": 0.013782866831703044\n }\n}\n```", "repo_url": "https://huggingface.co/glaiveai/glaive-coder-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|arc:challenge|25_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T06_05_57.317368", "path": ["**/details_harness|drop|3_2023-10-29T06-05-57.317368.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T06-05-57.317368.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T06_05_57.317368", "path": ["**/details_harness|gsm8k|5_2023-10-29T06-05-57.317368.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T06-05-57.317368.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hellaswag|10_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T05-33-12.124557.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T05-33-12.124557.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T06_05_57.317368", "path": ["**/details_harness|winogrande|5_2023-10-29T06-05-57.317368.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T06-05-57.317368.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T05_33_12.124557", "path": ["results_2023-09-22T05-33-12.124557.parquet"]}, {"split": "2023_10_29T06_05_57.317368", "path": ["results_2023-10-29T06-05-57.317368.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T06-05-57.317368.parquet"]}]}]}
2023-10-29T06:06:10+00:00
[]
[]
65462fec37cf881f2fd8b4a17928ba37a4749b5b
# Dataset Card for Evaluation run of migtissera/Synthia-7B-v1.2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/migtissera/Synthia-7B-v1.2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [migtissera/Synthia-7B-v1.2](https://huggingface.co/migtissera/Synthia-7B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-7B-v1.2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-25T08:51:48.447096](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B-v1.2/blob/main/results_2023-10-25T08-51-48.447096.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.08913590604026846,
        "em_stderr": 0.0029180503705090555,
        "f1": 0.16236577181208006,
        "f1_stderr": 0.003176440216561889,
        "acc": 0.4220056810396051,
        "acc_stderr": 0.01047928870180564
    },
    "harness|drop|3": {
        "em": 0.08913590604026846,
        "em_stderr": 0.0029180503705090555,
        "f1": 0.16236577181208006,
        "f1_stderr": 0.003176440216561889
    },
    "harness|gsm8k|5": {
        "acc": 0.10841546626231995,
        "acc_stderr": 0.00856385250662748
    },
    "harness|winogrande|5": {
        "acc": 0.7355958958168903,
        "acc_stderr": 0.012394724896983799
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
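The "results" configuration mentioned above can be loaded the same way. A minimal sketch, assuming the split names listed in this repo's configs (where "latest" always points to the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics for every evaluated task; the "latest" split always
# points to the most recent run (here 2023-10-25T08:51:48.447096).
results = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-7B-v1.2",
	"results",
	split="latest")
```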
open-llm-leaderboard/details_migtissera__Synthia-7B-v1.2
[ "region:us" ]
2023-09-22T04:35:44+00:00
{"pretty_name": "Evaluation run of migtissera/Synthia-7B-v1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Synthia-7B-v1.2](https://huggingface.co/migtissera/Synthia-7B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-7B-v1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T08:51:48.447096](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B-v1.2/blob/main/results_2023-10-25T08-51-48.447096.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08913590604026846,\n \"em_stderr\": 0.0029180503705090555,\n \"f1\": 0.16236577181208006,\n \"f1_stderr\": 0.003176440216561889,\n \"acc\": 0.4220056810396051,\n \"acc_stderr\": 0.01047928870180564\n },\n \"harness|drop|3\": {\n \"em\": 0.08913590604026846,\n \"em_stderr\": 0.0029180503705090555,\n \"f1\": 0.16236577181208006,\n \"f1_stderr\": 0.003176440216561889\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \"acc_stderr\": 0.00856385250662748\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n }\n}\n```", "repo_url": "https://huggingface.co/migtissera/Synthia-7B-v1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|arc:challenge|25_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T08_51_48.447096", "path": ["**/details_harness|drop|3_2023-10-25T08-51-48.447096.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T08-51-48.447096.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T08_51_48.447096", "path": ["**/details_harness|gsm8k|5_2023-10-25T08-51-48.447096.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T08-51-48.447096.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hellaswag|10_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-35-25.402553.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-35-25.402553.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T05-35-25.402553.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T05-35-25.402553.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T05-35-25.402553.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T08_51_48.447096", "path": ["**/details_harness|winogrande|5_2023-10-25T08-51-48.447096.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T08-51-48.447096.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T05_35_25.402553", "path": ["results_2023-09-22T05-35-25.402553.parquet"]}, {"split": "2023_10_25T08_51_48.447096", "path": ["results_2023-10-25T08-51-48.447096.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T08-51-48.447096.parquet"]}]}]}
2023-10-25T07:52:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of migtissera/Synthia-7B-v1.2

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model migtissera/Synthia-7B-v1.2 on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the snippet after this card):

## Latest results

These are the latest results from run 2023-10-25T08:51:48.447096 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
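The loading example referred to above, as given in the full card for this dataset:

```python
from datasets import load_dataset

# Load the per-sample details for one task/run; "train" points to the latest run.
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-7B-v1.2",
	"harness_winogrande_5",
	split="train")
```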
[ "# Dataset Card for Evaluation run of migtissera/Synthia-7B-v1.2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Synthia-7B-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T08:51:48.447096(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of migtissera/Synthia-7B-v1.2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Synthia-7B-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T08:51:48.447096(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of migtissera/Synthia-7B-v1.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Synthia-7B-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T08:51:48.447096(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a36478ba247a71f50cf04b7c7b6b68a6237bd259
# Dataset Card for "biology-scienceqa" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
veggiebird/biology-scienceqa
[ "region:us" ]
2023-09-22T04:36:09+00:00
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "embeddings", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 7464263, "num_examples": 1596}], "download_size": 7087955, "dataset_size": 7464263}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T04:36:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for "biology-scienceqa" More Information needed
[ "# Dataset Card for \"biology-scienceqa\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"biology-scienceqa\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"biology-scienceqa\"\n\nMore Information needed" ]
25395fe701482c1a82e484a4003d54938a611036
# Dataset Card for "warmeVersorgen-50-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/warmeVersorgen-50-undersampled
[ "region:us" ]
2023-09-22T04:42:25+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Beziehen", "1": "Erzeugen", "2": "Speichern", "3": "Verteilen"}}}}], "splits": [{"name": "train", "num_bytes": 39397.008169573855, "num_examples": 200}, {"name": "test", "num_bytes": 447086, "num_examples": 2265}, {"name": "valid", "num_bytes": 447086, "num_examples": 2265}], "download_size": 342904, "dataset_size": 933569.0081695738}}
2023-09-22T04:42:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for "warmeVersorgen-50-undersampled" More Information needed
[ "# Dataset Card for \"warmeVersorgen-50-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"warmeVersorgen-50-undersampled\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"warmeVersorgen-50-undersampled\"\n\nMore Information needed" ]
37b1fcf69976c211bce24bf1a9cb4718b0a96659
# Dataset Card for "waermeVersorgen-100-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/waermeVersorgen-100-undersampled
[ "region:us" ]
2023-09-22T04:42:29+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Beziehen", "1": "Erzeugen", "2": "Speichern", "3": "Verteilen"}}}}], "splits": [{"name": "train", "num_bytes": 78794.01633914771, "num_examples": 400}, {"name": "test", "num_bytes": 447086, "num_examples": 2265}, {"name": "valid", "num_bytes": 447086, "num_examples": 2265}], "download_size": 355050, "dataset_size": 972966.0163391477}}
2023-09-22T04:42:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "waermeVersorgen-100-undersampled" More Information needed
[ "# Dataset Card for \"waermeVersorgen-100-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"waermeVersorgen-100-undersampled\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"waermeVersorgen-100-undersampled\"\n\nMore Information needed" ]
947dde644c664cfda10dc4ce85a1c3c93b7fd565
# Dataset Card for "waermeVersorgen-200-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/waermeVersorgen-200-undersampled
[ "region:us" ]
2023-09-22T04:42:33+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Beziehen", "1": "Erzeugen", "2": "Speichern", "3": "Verteilen"}}}}], "splits": [{"name": "train", "num_bytes": 144390.03494148818, "num_examples": 733}, {"name": "test", "num_bytes": 447086, "num_examples": 2265}, {"name": "valid", "num_bytes": 447086, "num_examples": 2265}], "download_size": 374039, "dataset_size": 1038562.0349414882}}
2023-09-22T04:42:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for "waermeVersorgen-200-undersampled" More Information needed
[ "# Dataset Card for \"waermeVersorgen-200-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"waermeVersorgen-200-undersampled\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"waermeVersorgen-200-undersampled\"\n\nMore Information needed" ]
5e5c4cbae0328506e5fc346358e19f76f85ad475
# Dataset Card for "103deca7" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
result-kand2-sdxl-wuerst-karlo/103deca7
[ "region:us" ]
2023-09-22T04:57:31+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 210, "num_examples": 10}], "download_size": 1367, "dataset_size": 210}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T04:57:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for "103deca7" More Information needed
[ "# Dataset Card for \"103deca7\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"103deca7\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"103deca7\"\n\nMore Information needed" ]
96b35e12eaca2f000634da450e6961bb11d35133
# Dataset Card for "luftVersorgen-50-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/luftVersorgen-50-undersampled
[ "region:us" ]
2023-09-22T04:58:54+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "LuftBereitstellen", "1": "LuftVerteilen"}}}}], "splits": [{"name": "train", "num_bytes": 19757.430602572782, "num_examples": 100}, {"name": "test", "num_bytes": 290707, "num_examples": 1477}, {"name": "valid", "num_bytes": 290707, "num_examples": 1477}], "download_size": 227539, "dataset_size": 601171.4306025729}}
2023-09-22T04:58:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for "luftVersorgen-50-undersampled" More Information needed
[ "# Dataset Card for \"luftVersorgen-50-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"luftVersorgen-50-undersampled\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"luftVersorgen-50-undersampled\"\n\nMore Information needed" ]
6612d229dea0d444dfb455c6c9ec89a56a003290
# Dataset Card for "luftVersorgen-100-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/luftVersorgen-100-undersampled
[ "region:us" ]
2023-09-22T04:58:59+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "LuftBereitstellen", "1": "LuftVerteilen"}}}}], "splits": [{"name": "train", "num_bytes": 39514.861205145564, "num_examples": 200}, {"name": "test", "num_bytes": 290707, "num_examples": 1477}, {"name": "valid", "num_bytes": 290707, "num_examples": 1477}], "download_size": 234233, "dataset_size": 620928.8612051455}}
2023-09-22T04:59:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for "luftVersorgen-100-undersampled" More Information needed
[ "# Dataset Card for \"luftVersorgen-100-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"luftVersorgen-100-undersampled\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"luftVersorgen-100-undersampled\"\n\nMore Information needed" ]
7596adeaba1236e6d029f6114d323bf6bd85637f
# Dataset Card for "luftVersorgen-200-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/luftVersorgen-200-undersampled
[ "region:us" ]
2023-09-22T04:59:03+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "LuftBereitstellen", "1": "LuftVerteilen"}}}}], "splits": [{"name": "train", "num_bytes": 79029.72241029113, "num_examples": 400}, {"name": "test", "num_bytes": 290707, "num_examples": 1477}, {"name": "valid", "num_bytes": 290707, "num_examples": 1477}], "download_size": 247001, "dataset_size": 660443.7224102912}}
2023-09-22T04:59:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for "luftVersorgen-200-undersampled" More Information needed
[ "# Dataset Card for \"luftVersorgen-200-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"luftVersorgen-200-undersampled\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"luftVersorgen-200-undersampled\"\n\nMore Information needed" ]
5729d1ed8478c6c126cc490755396b379ba96c39
SQuAD V2 validation set with counterfactual context
Deema/squad_v2_counterfactual
[ "region:us" ]
2023-09-22T04:59:19+00:00
{}
2023-09-22T05:00:30+00:00
[]
[]
TAGS #region-us
SQuAD V2 validation set with counterfactual context
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
91d4c72a648b459e5b3fa12fd44c0850fd0330b6
# Dataset Card for "wiki-bpe-48k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
cyrilzhang/wiki-bpe-48k
[ "region:us" ]
2023-09-22T05:07:24+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}], "splits": [{"name": "train", "num_bytes": 20505990100, "num_examples": 5001461}, {"name": "test", "num_bytes": 206143900, "num_examples": 50279}], "download_size": 9547305598, "dataset_size": 20712134000}}
2023-09-22T05:13:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for "wiki-bpe-48k" More Information needed
[ "# Dataset Card for \"wiki-bpe-48k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"wiki-bpe-48k\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"wiki-bpe-48k\"\n\nMore Information needed" ]
5a8f597b81fbe3400f0fe5b3c09fb764bf721988
# Dataset Card for "medienVersorgen-50-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/medienVersorgen-50-undersampled
[ "region:us" ]
2023-09-22T05:11:46+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Bereitstellen", "1": "Entsorgen", "2": "Speichern", "3": "Verteilen"}}}}], "splits": [{"name": "train", "num_bytes": 37075.44918032787, "num_examples": 188}, {"name": "test", "num_bytes": 14725, "num_examples": 77}, {"name": "valid", "num_bytes": 14725, "num_examples": 77}], "download_size": 36084, "dataset_size": 66525.44918032788}}
2023-09-22T05:11:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for "medienVersorgen-50-undersampled" More Information needed
[ "# Dataset Card for \"medienVersorgen-50-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"medienVersorgen-50-undersampled\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"medienVersorgen-50-undersampled\"\n\nMore Information needed" ]
1b9a87681829b51628bdf6dea4b4bf39ae85e887
# Dataset Card for "medienVersorgen-100-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/medienVersorgen-100-undersampled
[ "region:us" ]
2023-09-22T05:11:50+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Bereitstellen", "1": "Entsorgen", "2": "Speichern", "3": "Verteilen"}}}}], "splits": [{"name": "train", "num_bytes": 59754.580327868855, "num_examples": 303}, {"name": "test", "num_bytes": 14725, "num_examples": 77}, {"name": "valid", "num_bytes": 14725, "num_examples": 77}], "download_size": 42237, "dataset_size": 89204.58032786885}}
2023-09-22T05:11:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for "medienVersorgen-100-undersampled" More Information needed
[ "# Dataset Card for \"medienVersorgen-100-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"medienVersorgen-100-undersampled\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"medienVersorgen-100-undersampled\"\n\nMore Information needed" ]
2c63a4819253b47a826548361d3379ed7445d2de
# Dataset Card for "medienVersorgen-200-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/medienVersorgen-200-undersampled
[ "region:us" ]
2023-09-22T05:11:54+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Bereitstellen", "1": "Entsorgen", "2": "Speichern", "3": "Verteilen"}}}}], "splits": [{"name": "train", "num_bytes": 79475.56393442623, "num_examples": 403}, {"name": "test", "num_bytes": 14725, "num_examples": 77}, {"name": "valid", "num_bytes": 14725, "num_examples": 77}], "download_size": 47115, "dataset_size": 108925.56393442623}}
2023-09-22T05:11:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for "medienVersorgen-200-undersampled" More Information needed
[ "# Dataset Card for \"medienVersorgen-200-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"medienVersorgen-200-undersampled\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"medienVersorgen-200-undersampled\"\n\nMore Information needed" ]
2419c356bdb2c3134aa52e2f12d6c20c279ad76a
# Dataset Card for "DocLayNet-tiny" Tiny set for unit tests based on https://huggingface.co/datasets/pierreguillou/DocLayNet-small. Total ~0.1% of DocLayNet.
miikatoi/DocLayNet-tiny
[ "region:us" ]
2023-09-22T05:22:19+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "texts", "sequence": "string"}, {"name": "bboxes_block", "sequence": {"sequence": "int64"}}, {"name": "bboxes_line", "sequence": {"sequence": "int64"}}, {"name": "categories", "sequence": {"class_label": {"names": {"0": "Caption", "1": "Footnote", "2": "Formula", "3": "List-item", "4": "Page-footer", "5": "Page-header", "6": "Picture", "7": "Section-header", "8": "Table", "9": "Text", "10": "Title"}}}}, {"name": "image", "dtype": "image"}, {"name": "page_hash", "dtype": "string"}, {"name": "original_filename", "dtype": "string"}, {"name": "page_no", "dtype": "int32"}, {"name": "num_pages", "dtype": "int32"}, {"name": "original_width", "dtype": "int32"}, {"name": "original_height", "dtype": "int32"}, {"name": "coco_width", "dtype": "int32"}, {"name": "coco_height", "dtype": "int32"}, {"name": "collection", "dtype": "string"}, {"name": "doc_category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 28393556.512301013, "num_examples": 70}, {"name": "validation", "num_bytes": 2641091.359375, "num_examples": 7}, {"name": "test", "num_bytes": 1779922.857142857, "num_examples": 5}], "download_size": 31476812, "dataset_size": 32814570.72881887}}
2023-09-22T05:24:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "DocLayNet-tiny" Tiny set for unit tests based on URL Total ~0.1% of DocLayNet.
[ "# Dataset Card for \"DocLayNet-tiny\"\n\nTiny set for unit tests based on URL\n\nTotal ~0.1% of DocLayNet." ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"DocLayNet-tiny\"\n\nTiny set for unit tests based on URL\n\nTotal ~0.1% of DocLayNet." ]
[ 6, 33 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"DocLayNet-tiny\"\n\nTiny set for unit tests based on URL\n\nTotal ~0.1% of DocLayNet." ]
a14179239de4b9c4c18561c991db12eebd8d8647
# Dataset Card for "sichern-50-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/sichern-50-undersampled
[ "region:us" ]
2023-09-22T05:22:53+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Brandmeldeanlage", "1": "Brandschutzklappe", "2": "Einbruchmeldeanlage", "3": "Entrauchung-Ventilator", "4": "Feuerl\u00f6schanlage", "5": "Gaswarnanlage", "6": "Notruf", "7": "Rauchmeldeanlage"}}}}], "splits": [{"name": "train", "num_bytes": 38006.082374966565, "num_examples": 193}, {"name": "test", "num_bytes": 186480, "num_examples": 935}, {"name": "valid", "num_bytes": 186480, "num_examples": 935}], "download_size": 130269, "dataset_size": 410966.0823749666}}
2023-09-22T05:22:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for "sichern-50-undersampled" More Information needed
[ "# Dataset Card for \"sichern-50-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"sichern-50-undersampled\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"sichern-50-undersampled\"\n\nMore Information needed" ]
8fd16983bbbe39ee07726f34c6e2533a8a149518
# Dataset Card for "sichern-100-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/sichern-100-undersampled
[ "region:us" ]
2023-09-22T05:22:58+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Brandmeldeanlage", "1": "Brandschutzklappe", "2": "Einbruchmeldeanlage", "3": "Entrauchung-Ventilator", "4": "Feuerl\u00f6schanlage", "5": "Gaswarnanlage", "6": "Notruf", "7": "Rauchmeldeanlage"}}}}], "splits": [{"name": "train", "num_bytes": 66362.95212623697, "num_examples": 337}, {"name": "test", "num_bytes": 186480, "num_examples": 935}, {"name": "valid", "num_bytes": 186480, "num_examples": 935}], "download_size": 138099, "dataset_size": 439322.952126237}}
2023-09-22T05:23:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for "sichern-100-undersampled" More Information needed
[ "# Dataset Card for \"sichern-100-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"sichern-100-undersampled\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"sichern-100-undersampled\"\n\nMore Information needed" ]
e1533c7c5c857a6db7d57030363ae9faa9168b28
# Dataset Card for "sichern-200-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/sichern-200-undersampled
[ "region:us" ]
2023-09-22T05:23:02+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Brandmeldeanlage", "1": "Brandschutzklappe", "2": "Einbruchmeldeanlage", "3": "Entrauchung-Ventilator", "4": "Feuerl\u00f6schanlage", "5": "Gaswarnanlage", "6": "Notruf", "7": "Rauchmeldeanlage"}}}}], "splits": [{"name": "train", "num_bytes": 105747.49344744584, "num_examples": 537}, {"name": "test", "num_bytes": 186480, "num_examples": 935}, {"name": "valid", "num_bytes": 186480, "num_examples": 935}], "download_size": 148661, "dataset_size": 478707.49344744586}}
2023-09-22T05:23:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for "sichern-200-undersampled" More Information needed
[ "# Dataset Card for \"sichern-200-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"sichern-200-undersampled\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"sichern-200-undersampled\"\n\nMore Information needed" ]
71386a46413f5905460d6084414b000925d2e8cd
# Dataset Card for "wiki-bpe-64k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
cyrilzhang/wiki-bpe-64k
[ "region:us" ]
2023-09-22T05:27:37+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}], "splits": [{"name": "train", "num_bytes": 20157432700, "num_examples": 4916447}, {"name": "test", "num_bytes": 202663000, "num_examples": 49430}], "download_size": 8837145740, "dataset_size": 20360095700}}
2023-09-22T05:33:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for "wiki-bpe-64k" More Information needed
[ "# Dataset Card for \"wiki-bpe-64k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"wiki-bpe-64k\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"wiki-bpe-64k\"\n\nMore Information needed" ]
daad5dea85511279ddafd793604caa7a947fed7b
# Dataset Card for "hh-rlhf_with_features_flan_t5_large_flan_t5_large_zeroshot" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dongyoung4091/hh-rlhf_with_features_flan_t5_large_flan_t5_large_zeroshot
[ "region:us" ]
2023-09-22T05:27:55+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "helpfulness_chosen", "dtype": "int64"}, {"name": "helpfulness_rejected", "dtype": "int64"}, {"name": "specificity_chosen", "dtype": "int64"}, {"name": "specificity_rejected", "dtype": "int64"}, {"name": "intent_chosen", "dtype": "int64"}, {"name": "intent_rejected", "dtype": "int64"}, {"name": "factuality_chosen", "dtype": "int64"}, {"name": "factuality_rejected", "dtype": "int64"}, {"name": "easy-to-understand_chosen", "dtype": "int64"}, {"name": "easy-to-understand_rejected", "dtype": "int64"}, {"name": "relevance_chosen", "dtype": "int64"}, {"name": "relevance_rejected", "dtype": "int64"}, {"name": "readability_chosen", "dtype": "int64"}, {"name": "readability_rejected", "dtype": "int64"}, {"name": "enough-detail_chosen", "dtype": "int64"}, {"name": "enough-detail_rejected", "dtype": "int64"}, {"name": "biased:_chosen", "dtype": "int64"}, {"name": "biased:_rejected", "dtype": "int64"}, {"name": "fail-to-consider-individual-preferences_chosen", "dtype": "int64"}, {"name": "fail-to-consider-individual-preferences_rejected", "dtype": "int64"}, {"name": "repetetive_chosen", "dtype": "int64"}, {"name": "repetetive_rejected", "dtype": "int64"}, {"name": "fail-to-consider-context_chosen", "dtype": "int64"}, {"name": "fail-to-consider-context_rejected", "dtype": "int64"}, {"name": "too-long_chosen", "dtype": "int64"}, {"name": "too-long_rejected", "dtype": "int64"}, {"name": "human", "dtype": "string"}, {"name": "assistant_chosen", "dtype": "string"}, {"name": "assistant_rejected", "dtype": "string"}, {"name": "log_score_chosen", "dtype": "float64"}, {"name": "log_score_rejected", "dtype": "float64"}, {"name": "labels", "dtype": "string"}, {"name": "zeroshot_helpfulness_chosen", "dtype": "float64"}, {"name": "zeroshot_helpfulness_rejected", "dtype": "float64"}, {"name": "zeroshot_specificity_chosen", "dtype": "float64"}, {"name": "zeroshot_specificity_rejected", "dtype": "float64"}, {"name": "zeroshot_intent_chosen", "dtype": "float64"}, {"name": "zeroshot_intent_rejected", "dtype": "float64"}, {"name": "zeroshot_factuality_chosen", "dtype": "float64"}, {"name": "zeroshot_factuality_rejected", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand_chosen", "dtype": "float64"}, {"name": "zeroshot_easy-to-understand_rejected", "dtype": "float64"}, {"name": "zeroshot_relevance_chosen", "dtype": "float64"}, {"name": "zeroshot_relevance_rejected", "dtype": "float64"}, {"name": "zeroshot_readability_chosen", "dtype": "float64"}, {"name": "zeroshot_readability_rejected", "dtype": "float64"}, {"name": "zeroshot_enough-detail_chosen", "dtype": "float64"}, {"name": "zeroshot_enough-detail_rejected", "dtype": "float64"}, {"name": "zeroshot_biased:_chosen", "dtype": "float64"}, {"name": "zeroshot_biased:_rejected", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences_chosen", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-individual-preferences_rejected", "dtype": "float64"}, {"name": "zeroshot_repetetive_chosen", "dtype": "float64"}, {"name": "zeroshot_repetetive_rejected", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-context_chosen", "dtype": "float64"}, {"name": "zeroshot_fail-to-consider-context_rejected", "dtype": "float64"}, {"name": "zeroshot_too-long_chosen", "dtype": "float64"}, 
{"name": "zeroshot_too-long_rejected", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 16425816, "num_examples": 9574}, {"name": "test", "num_bytes": 16369741, "num_examples": 9574}], "download_size": 16126499, "dataset_size": 32795557}}
2023-09-22T05:28:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for "hh-rlhf_with_features_flan_t5_large_flan_t5_large_zeroshot" More Information needed
[ "# Dataset Card for \"hh-rlhf_with_features_flan_t5_large_flan_t5_large_zeroshot\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"hh-rlhf_with_features_flan_t5_large_flan_t5_large_zeroshot\"\n\nMore Information needed" ]
[ 6, 39 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"hh-rlhf_with_features_flan_t5_large_flan_t5_large_zeroshot\"\n\nMore Information needed" ]
158f6ca2847280d4572a4eac1270f7c5884d3e42
# Dataset Card for "physics-scienceqa" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
veggiebird/physics-scienceqa
[ "region:us" ]
2023-09-22T05:34:51+00:00
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "embeddings", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 3744399, "num_examples": 810}], "download_size": 4028413, "dataset_size": 3744399}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-22T05:34:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for "physics-scienceqa" More Information needed
[ "# Dataset Card for \"physics-scienceqa\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"physics-scienceqa\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"physics-scienceqa\"\n\nMore Information needed" ]
2d17f05712ed86036a30b63e7d26fde27db7a8d7
# Dataset Card for "story4kids_0_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Falah/story4kids_0_prompts
[ "region:us" ]
2023-09-22T05:39:36+00:00
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2342, "num_examples": 9}], "download_size": 2943, "dataset_size": 2342}}
2023-09-22T05:39:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for "story4kids_0_prompts" More Information needed
[ "# Dataset Card for \"story4kids_0_prompts\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"story4kids_0_prompts\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"story4kids_0_prompts\"\n\nMore Information needed" ]
9db5af8c3e2d6b2519f33e1be1e665626464656f
# Dataset Card for "story4kids_1_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Falah/story4kids_1_prompts
[ "region:us" ]
2023-09-22T05:39:38+00:00
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2279, "num_examples": 9}], "download_size": 4259, "dataset_size": 2279}}
2023-09-22T05:39:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for "story4kids_1_prompts" More Information needed
[ "# Dataset Card for \"story4kids_1_prompts\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"story4kids_1_prompts\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"story4kids_1_prompts\"\n\nMore Information needed" ]
899dc1b05df8e73f15e41aa3ebd30867a611f0ac
# Dataset Card for "story4kids_2_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Falah/story4kids_2_prompts
[ "region:us" ]
2023-09-22T05:39:39+00:00
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2537, "num_examples": 10}], "download_size": 3322, "dataset_size": 2537}}
2023-09-22T05:39:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for "story4kids_2_prompts" More Information needed
[ "# Dataset Card for \"story4kids_2_prompts\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"story4kids_2_prompts\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"story4kids_2_prompts\"\n\nMore Information needed" ]
85dbdb0fe2202fbcc9afb8575a822210cfbe27b4
# Dataset Card for "waermeErzeugen-50-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/waermeErzeugen-50-undersampled
[ "region:us" ]
2023-09-22T06:11:03+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "ZweiteGrundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "BHKW", "1": "Kessel", "2": "Pelletkessel", "3": "Waermepumpe", "4": "WaermeversorgerAllgemein"}}}}], "splits": [{"name": "train", "num_bytes": 37366.89908256881, "num_examples": 209}, {"name": "test", "num_bytes": 38880, "num_examples": 218}, {"name": "valid", "num_bytes": 38880, "num_examples": 218}], "download_size": 54745, "dataset_size": 115126.89908256881}}
2023-09-22T06:11:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for "waermeErzeugen-50-undersampled" More Information needed
[ "# Dataset Card for \"waermeErzeugen-50-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"waermeErzeugen-50-undersampled\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"waermeErzeugen-50-undersampled\"\n\nMore Information needed" ]
928ab18900785024172caefceb20952543823e66
# Dataset Card for "waermeErzeugen-100-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/waermeErzeugen-100-undersampled
[ "region:us" ]
2023-09-22T06:11:07+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "ZweiteGrundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "BHKW", "1": "Kessel", "2": "Pelletkessel", "3": "Waermepumpe", "4": "WaermeversorgerAllgemein"}}}}], "splits": [{"name": "train", "num_bytes": 64185.247706422015, "num_examples": 359}, {"name": "test", "num_bytes": 38880, "num_examples": 218}, {"name": "valid", "num_bytes": 38880, "num_examples": 218}], "download_size": 61981, "dataset_size": 141945.247706422}}
2023-09-22T06:11:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for "waermeErzeugen-100-undersampled" More Information needed
[ "# Dataset Card for \"waermeErzeugen-100-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"waermeErzeugen-100-undersampled\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"waermeErzeugen-100-undersampled\"\n\nMore Information needed" ]
648e012bdf7c20161a88b3b6c10b07a638a7cef8
# Dataset Card for "waermeErzeugensichern-200-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/waermeErzeugensichern-200-undersampled
[ "region:us" ]
2023-09-22T06:11:11+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "ZweiteGrundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "BHKW", "1": "Kessel", "2": "Pelletkessel", "3": "Waermepumpe", "4": "WaermeversorgerAllgemein"}}}}], "splits": [{"name": "train", "num_bytes": 117821.94495412844, "num_examples": 659}, {"name": "test", "num_bytes": 38880, "num_examples": 218}, {"name": "valid", "num_bytes": 38880, "num_examples": 218}], "download_size": 76901, "dataset_size": 195581.94495412844}}
2023-09-22T06:11:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for "waermeErzeugensichern-200-undersampled" More Information needed
[ "# Dataset Card for \"waermeErzeugensichern-200-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"waermeErzeugensichern-200-undersampled\"\n\nMore Information needed" ]
[ 6, 24 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"waermeErzeugensichern-200-undersampled\"\n\nMore Information needed" ]
411b3723edb62fda82c3f3246f4bb5005839a6cb
# Dataset Card for Evaluation run of willyninja30/ARIA-70B-French ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/willyninja30/ARIA-70B-French - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [willyninja30/ARIA-70B-French](https://huggingface.co/willyninja30/ARIA-70B-French) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_willyninja30__ARIA-70B-French", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T03:07:36.932003](https://huggingface.co/datasets/open-llm-leaderboard/details_willyninja30__ARIA-70B-French/blob/main/results_2023-10-25T03-07-36.932003.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.040373322147651006, "em_stderr": 0.0020157564185176837, "f1": 0.1050272651006715, "f1_stderr": 0.0023756238577676155, "acc": 0.5359600711595986, "acc_stderr": 0.011658939983913113 }, "harness|drop|3": { "em": 0.040373322147651006, "em_stderr": 0.0020157564185176837, "f1": 0.1050272651006715, "f1_stderr": 0.0023756238577676155 }, "harness|gsm8k|5": { "acc": 0.266868840030326, "acc_stderr": 0.012183780551887957 }, "harness|winogrande|5": { "acc": 0.8050513022888713, "acc_stderr": 0.011134099415938268 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
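Beyond the per-task configurations, the card above mentions a "results" configuration that stores the aggregated metrics; a hedged sketch, assuming it loads the same way as the task configs:

```python
from datasets import load_dataset

# Sketch: pull the aggregated metrics; "train" points at the latest run per the card.
results = load_dataset(
    "open-llm-leaderboard/details_willyninja30__ARIA-70B-French", "results", split="train"
)
print(results[0])
```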
open-llm-leaderboard/details_willyninja30__ARIA-70B-French
[ "region:us" ]
2023-09-22T06:23:13+00:00
{"pretty_name": "Evaluation run of willyninja30/ARIA-70B-French", "dataset_summary": "Dataset automatically created during the evaluation run of model [willyninja30/ARIA-70B-French](https://huggingface.co/willyninja30/ARIA-70B-French) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_willyninja30__ARIA-70B-French\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T03:07:36.932003](https://huggingface.co/datasets/open-llm-leaderboard/details_willyninja30__ARIA-70B-French/blob/main/results_2023-10-25T03-07-36.932003.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.040373322147651006,\n \"em_stderr\": 0.0020157564185176837,\n \"f1\": 0.1050272651006715,\n \"f1_stderr\": 0.0023756238577676155,\n \"acc\": 0.5359600711595986,\n \"acc_stderr\": 0.011658939983913113\n },\n \"harness|drop|3\": {\n \"em\": 0.040373322147651006,\n \"em_stderr\": 0.0020157564185176837,\n \"f1\": 0.1050272651006715,\n \"f1_stderr\": 0.0023756238577676155\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.266868840030326,\n \"acc_stderr\": 0.012183780551887957\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938268\n }\n}\n```", "repo_url": "https://huggingface.co/willyninja30/ARIA-70B-French", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|arc:challenge|25_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T03_07_36.932003", "path": ["**/details_harness|drop|3_2023-10-25T03-07-36.932003.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T03-07-36.932003.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T03_07_36.932003", "path": ["**/details_harness|gsm8k|5_2023-10-25T03-07-36.932003.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T03-07-36.932003.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hellaswag|10_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-22-49.937285.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-22-49.937285.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T07-22-49.937285.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T07-22-49.937285.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T07-22-49.937285.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T03_07_36.932003", "path": ["**/details_harness|winogrande|5_2023-10-25T03-07-36.932003.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T03-07-36.932003.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T07_22_49.937285", "path": ["results_2023-09-22T07-22-49.937285.parquet"]}, {"split": "2023_10_25T03_07_36.932003", "path": ["results_2023-10-25T03-07-36.932003.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T03-07-36.932003.parquet"]}]}]}
2023-10-25T02:07:50+00:00
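The metadata above registers one configuration per harness task (plus an aggregated "results" configuration), each with a timestamped split and a "latest" alias. These configurations can also be discovered programmatically; a minimal sketch, where the repository id is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention:

```python
from datasets import get_dataset_config_names

# Assumed repository id, following the open-llm-leaderboard naming convention.
repo_id = "open-llm-leaderboard/details_willyninja30__ARIA-70B-French"

# Lists every registered configuration: the per-task harness configs
# (e.g. "harness_hendrycksTest_world_religions_5", "harness_winogrande_5")
# plus the aggregated "results" config.
configs = get_dataset_config_names(repo_id)
print(len(configs), configs[:5])
```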
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of willyninja30/ARIA-70B-French ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model willyninja30/ARIA-70B-French on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2023-10-25T03:07:36.932003 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
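A minimal sketch of the loading step referenced in the card, using the `datasets` library; the repository id is an assumption based on the leaderboard's naming convention, and "harness_winogrande_5" is one of the configurations registered in the metadata above:

```python
from datasets import load_dataset

# Assumed repository id (details_<org>__<model> convention).
data = load_dataset(
    "open-llm-leaderboard/details_willyninja30__ARIA-70B-French",
    "harness_winogrande_5",  # any configuration from the metadata works here
    split="latest",          # "latest" aliases the most recent run's parquet
)
```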
e21200a0d6b69687ed168414bc40fa8ec490248d
# Dataset Card for Evaluation run of KnutJaegersberg/deacon-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/KnutJaegersberg/deacon-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [KnutJaegersberg/deacon-13b](https://huggingface.co/KnutJaegersberg/deacon-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__deacon-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T07:52:54.857198](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-13b/blob/main/results_2023-10-27T07-52-54.857198.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.0003630560893119389,
        "f1": 0.05671665268456401,
        "f1_stderr": 0.001312852180013837,
        "acc": 0.43354338539457016,
        "acc_stderr": 0.010175607297065709
    },
    "harness|drop|3": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.0003630560893119389,
        "f1": 0.05671665268456401,
        "f1_stderr": 0.001312852180013837
    },
    "harness|gsm8k|5": {
        "acc": 0.10386656557998483,
        "acc_stderr": 0.008403622228924029
    },
    "harness|winogrande|5": {
        "acc": 0.7632202052091555,
        "acc_stderr": 0.011947592365207389
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
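The aggregated metrics shown above are also stored as a standalone "results" configuration (registered in the metadata below), whose "latest" split points at the most recent aggregated parquet. A minimal sketch of pulling it:

```python
from datasets import load_dataset

# The "results" configuration collects the aggregated metrics of every run;
# "latest" aliases the newest results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__deacon-13b",
    "results",
    split="latest",
)
print(results[0])  # row holding the aggregated metrics for the most recent run
```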
open-llm-leaderboard/details_KnutJaegersberg__deacon-13b
[ "region:us" ]
2023-09-22T06:24:38+00:00
{"pretty_name": "Evaluation run of KnutJaegersberg/deacon-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/deacon-13b](https://huggingface.co/KnutJaegersberg/deacon-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__deacon-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T07:52:54.857198](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-13b/blob/main/results_2023-10-27T07-52-54.857198.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119389,\n \"f1\": 0.05671665268456401,\n \"f1_stderr\": 0.001312852180013837,\n \"acc\": 0.43354338539457016,\n \"acc_stderr\": 0.010175607297065709\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119389,\n \"f1\": 0.05671665268456401,\n \"f1_stderr\": 0.001312852180013837\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10386656557998483,\n \"acc_stderr\": 0.008403622228924029\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207389\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/deacon-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|arc:challenge|25_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T07_52_54.857198", "path": ["**/details_harness|drop|3_2023-10-27T07-52-54.857198.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T07-52-54.857198.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T07_52_54.857198", "path": ["**/details_harness|gsm8k|5_2023-10-27T07-52-54.857198.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T07-52-54.857198.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hellaswag|10_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-24-15.341487.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-24-15.341487.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-22T07-24-15.341487.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T07-24-15.341487.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-22T07-24-15.341487.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T07_52_54.857198", "path": ["**/details_harness|winogrande|5_2023-10-27T07-52-54.857198.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T07-52-54.857198.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_22T07_24_15.341487", "path": ["results_2023-09-22T07-24-15.341487.parquet"]}, {"split": "2023_10_27T07_52_54.857198", "path": ["results_2023-10-27T07-52-54.857198.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T07-52-54.857198.parquet"]}]}]}
2023-10-27T06:53:09+00:00
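Each configuration in the metadata above exposes one split per run timestamp plus a "latest" alias; for a task evaluated in a single run, both splits resolve to the same parquet file. A minimal sketch contrasting them, using names taken from this record's metadata:

```python
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_KnutJaegersberg__deacon-13b"
cfg = "harness_hendrycksTest_world_religions_5"

# The timestamped split pins a specific evaluation run...
run = load_dataset(repo_id, cfg, split="2023_09_22T07_24_15.341487")

# ...while "latest" follows whichever run came last (here, the same parquet).
latest = load_dataset(repo_id, cfg, split="latest")
```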
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KnutJaegersberg/deacon-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model KnutJaegersberg/deacon-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2023-10-27T07:52:54.857198 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
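The loading step referenced in the card, sketched with the `datasets` library; the repository id and the "harness_gsm8k_5" configuration both come from this record's metadata:

```python
from datasets import load_dataset

# "harness_gsm8k_5" is one of the configurations registered in the metadata above;
# its "latest" split points at the 2023-10-27 run's parquet.
data = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__deacon-13b",
    "harness_gsm8k_5",
    split="latest",
)
```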
[ "# Dataset Card for Evaluation run of KnutJaegersberg/deacon-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/deacon-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T07:52:54.857198(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KnutJaegersberg/deacon-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/deacon-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T07:52:54.857198(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/deacon-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/deacon-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T07:52:54.857198(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
de999597b662856cd8e7cf7ad883a5abcbc266f7
# Dataset Card for "waermeVerteilen-50-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/waermeVerteilen-50-undersampled
[ "region:us" ]
2023-09-22T06:26:44+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "ZweiteGrundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Druckhaltestation", "1": "HeizkreisAllgemein", "2": "Heizkurve", "3": "Kaeltemengenzaehler", "4": "Pumpe", "5": "Raum", "6": "Regler", "7": "Ruecklauf", "8": "Uebertrager", "9": "Ventil", "10": "Vorlauf", "11": "Waermemengenzaehler", "12": "Warmwasserbereitung"}}}}], "splits": [{"name": "train", "num_bytes": 114908.01213960546, "num_examples": 540}, {"name": "test", "num_bytes": 423002, "num_examples": 1978}, {"name": "valid", "num_bytes": 423002, "num_examples": 1978}], "download_size": 319448, "dataset_size": 960912.0121396055}}
2023-09-22T06:26:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for "waermeVerteilen-50-undersampled" More Information needed
[ "# Dataset Card for \"waermeVerteilen-50-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"waermeVerteilen-50-undersampled\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"waermeVerteilen-50-undersampled\"\n\nMore Information needed" ]
6d80e46a328841efe2a72f3d22670e003f075928
# Dataset Card for "waermeVerteilen-100-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/waermeVerteilen-100-undersampled
[ "region:us" ]
2023-09-22T06:26:51+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "ZweiteGrundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Druckhaltestation", "1": "HeizkreisAllgemein", "2": "Heizkurve", "3": "Kaeltemengenzaehler", "4": "Pumpe", "5": "Raum", "6": "Regler", "7": "Ruecklauf", "8": "Uebertrager", "9": "Ventil", "10": "Vorlauf", "11": "Waermemengenzaehler", "12": "Warmwasserbereitung"}}}}], "splits": [{"name": "train", "num_bytes": 216197.29691451695, "num_examples": 1016}, {"name": "test", "num_bytes": 423002, "num_examples": 1978}, {"name": "valid", "num_bytes": 423002, "num_examples": 1978}], "download_size": 353233, "dataset_size": 1062201.296914517}}
2023-09-22T06:26:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for "waermeVerteilen-100-undersampled" More Information needed
[ "# Dataset Card for \"waermeVerteilen-100-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"waermeVerteilen-100-undersampled\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"waermeVerteilen-100-undersampled\"\n\nMore Information needed" ]
ba669109dbca8921b0024dcffba1f4a239364f77
# Dataset Card for "waermeVerteilen-200-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mboth/waermeVerteilen-200-undersampled
[ "region:us" ]
2023-09-22T06:26:58+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "Datatype", "dtype": "string"}, {"name": "Beschreibung", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "Unit", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "Grundfunktion", "dtype": "string"}, {"name": "ZweiteGrundfunktion", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Druckhaltestation", "1": "HeizkreisAllgemein", "2": "Heizkurve", "3": "Kaeltemengenzaehler", "4": "Pumpe", "5": "Raum", "6": "Regler", "7": "Ruecklauf", "8": "Uebertrager", "9": "Ventil", "10": "Vorlauf", "11": "Waermemengenzaehler", "12": "Warmwasserbereitung"}}}}], "splits": [{"name": "train", "num_bytes": 407710.65048052603, "num_examples": 1916}, {"name": "test", "num_bytes": 423002, "num_examples": 1978}, {"name": "valid", "num_bytes": 423002, "num_examples": 1978}], "download_size": 411048, "dataset_size": 1253714.650480526}}
2023-09-22T06:27:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for "waermeVerteilen-200-undersampled" More Information needed
[ "# Dataset Card for \"waermeVerteilen-200-undersampled\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"waermeVerteilen-200-undersampled\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"waermeVerteilen-200-undersampled\"\n\nMore Information needed" ]
0a286dae07eade44b49c27d5d62dacbcc369d1d0
# Dataset Card for "village4kids_0_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Falah/village4kids_0_prompts
[ "region:us" ]
2023-09-22T06:31:49+00:00
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2702, "num_examples": 10}], "download_size": 4036, "dataset_size": 2702}}
2023-09-22T06:31:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for "village4kids_0_prompts" More Information needed
[ "# Dataset Card for \"village4kids_0_prompts\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"village4kids_0_prompts\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"village4kids_0_prompts\"\n\nMore Information needed" ]
bac25a6541d6c55ae9c313e433f05f7dbe84b9e0
# Dataset Card for "village4kids_1_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Falah/village4kids_1_prompts
[ "region:us" ]
2023-09-22T06:31:51+00:00
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2723, "num_examples": 11}], "download_size": 2840, "dataset_size": 2723}}
2023-09-22T06:31:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for "village4kids_1_prompts" More Information needed
[ "# Dataset Card for \"village4kids_1_prompts\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"village4kids_1_prompts\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"village4kids_1_prompts\"\n\nMore Information needed" ]
b87e82e1a23f7e8cd43017761ddf69f6df7f2f89
# Dataset Card for "village4kids_2_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Falah/village4kids_2_prompts
[ "region:us" ]
2023-09-22T06:31:52+00:00
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2094, "num_examples": 8}], "download_size": 2965, "dataset_size": 2094}}
2023-09-22T06:31:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for "village4kids_2_prompts" More Information needed
[ "# Dataset Card for \"village4kids_2_prompts\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"village4kids_2_prompts\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"village4kids_2_prompts\"\n\nMore Information needed" ]
b9483f03dbed17d99a5c9883d944e1308c68cef7
# Dataset Card for "neo-pop_surrealism" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Falah/neo-pop_surrealism
[ "region:us" ]
2023-09-22T06:37:30+00:00
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1590730, "num_examples": 10000}], "download_size": 18332, "dataset_size": 1590730}}
2023-09-22T06:37:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "neo-pop_surrealism" More Information needed
[ "# Dataset Card for \"neo-pop_surrealism\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"neo-pop_surrealism\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"neo-pop_surrealism\"\n\nMore Information needed" ]