| Field | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | list | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | list | 0 | 25 |
| languages | list | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | list | 0 | 352 |
| processed_texts | list | 1 | 353 |
| tokens_length | list | 1 | 353 |
| input_texts | list | 1 | 40 |
ed7601a4f04be8d250e13003bafee77afc6d0acd
# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_70M

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/BreadAi/gpt-YA-1-1_70M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [BreadAi/gpt-YA-1-1_70M](https://huggingface.co/BreadAi/gpt-YA-1-1_70M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-10-24T00:08:54.990074](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M/blob/main/results_2023-10-24T00-08-54.990074.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.03617869127516778,
        "em_stderr": 0.0019123366108896051,
        "f1": 0.06183619966442954,
        "f1_stderr": 0.0021429123236932604,
        "acc": 0.25453827940015783,
        "acc_stderr": 0.007025085047248848
    },
    "harness|drop|3": {
        "em": 0.03617869127516778,
        "em_stderr": 0.0019123366108896051,
        "f1": 0.06183619966442954,
        "f1_stderr": 0.0021429123236932604
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5090765588003157,
        "acc_stderr": 0.014050170094497697
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
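Beyond the per-task details loaded above, the aggregated scores live in the "results" configuration named in the card. Below is a minimal sketch, assuming only the `datasets` library; the "results" configuration and the "latest" / timestamped split names are taken from this card's metadata:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M"

# Enumerate the evaluation configurations (one per task, plus "results").
configs = get_dataset_config_names(repo)
print(f"{len(configs)} configurations, e.g. {configs[:3]}")

# "latest" always aliases the most recent run; timestamped splits such as
# "2023_10_24T00_08_54.990074" address a specific run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```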
open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M
[ "region:us" ]
2023-08-18T17:53:52+00:00
{"pretty_name": "Evaluation run of BreadAi/gpt-YA-1-1_70M", "dataset_summary": "Dataset automatically created during the evaluation run of model [BreadAi/gpt-YA-1-1_70M](https://huggingface.co/BreadAi/gpt-YA-1-1_70M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T00:08:54.990074](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M/blob/main/results_2023-10-24T00-08-54.990074.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03617869127516778,\n \"em_stderr\": 0.0019123366108896051,\n \"f1\": 0.06183619966442954,\n \"f1_stderr\": 0.0021429123236932604,\n \"acc\": 0.25453827940015783,\n \"acc_stderr\": 0.007025085047248848\n },\n \"harness|drop|3\": {\n \"em\": 0.03617869127516778,\n \"em_stderr\": 0.0019123366108896051,\n \"f1\": 0.06183619966442954,\n \"f1_stderr\": 0.0021429123236932604\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5090765588003157,\n \"acc_stderr\": 0.014050170094497697\n }\n}\n```", "repo_url": "https://huggingface.co/BreadAi/gpt-YA-1-1_70M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|arc:challenge|25_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T00_08_54.990074", "path": ["**/details_harness|drop|3_2023-10-24T00-08-54.990074.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T00-08-54.990074.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T00_08_54.990074", "path": ["**/details_harness|gsm8k|5_2023-10-24T00-08-54.990074.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T00-08-54.990074.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hellaswag|10_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:44:57.081356.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:44:57.081356.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T18:44:57.081356.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:44:57.081356.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T18:44:57.081356.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T18:44:57.081356.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T00_08_54.990074", "path": ["**/details_harness|winogrande|5_2023-10-24T00-08-54.990074.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T00-08-54.990074.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T18_44_57.081356", "path": ["results_2023-08-17T18:44:57.081356.parquet"]}, {"split": "2023_10_24T00_08_54.990074", "path": ["results_2023-10-24T00-08-54.990074.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T00-08-54.990074.parquet"]}]}]}
2023-10-23T23:09:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_70M ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model BreadAi/gpt-YA-1-1_70M on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-24T00:08:54.990074 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_70M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/gpt-YA-1-1_70M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-24T00:08:54.990074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_70M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/gpt-YA-1-1_70M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-24T00:08:54.990074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_70M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/gpt-YA-1-1_70M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T00:08:54.990074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cc2d4f1aa0baeadb850fde27213b2b7b9043c6ad
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-IA3

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2-13B-IA3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-IA3](https://huggingface.co/yeontaek/Platypus2-13B-IA3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-10-16T22:47:30.949762](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3/blob/main/results_2023-10-16T22-47-30.949762.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0019924496644295304,
        "em_stderr": 0.00045666764626669685,
        "f1": 0.057964555369127514,
        "f1_stderr": 0.0013475913067505284,
        "acc": 0.43493522214636066,
        "acc_stderr": 0.01038750232963205
    },
    "harness|drop|3": {
        "em": 0.0019924496644295304,
        "em_stderr": 0.00045666764626669685,
        "f1": 0.057964555369127514,
        "f1_stderr": 0.0013475913067505284
    },
    "harness|gsm8k|5": {
        "acc": 0.11296436694465505,
        "acc_stderr": 0.00871933902883306
    },
    "harness|winogrande|5": {
        "acc": 0.7569060773480663,
        "acc_stderr": 0.01205566563043104
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
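Since this repository also holds two runs, a specific run can be pinned by loading its timestamped split instead of "latest". A sketch under the same assumptions as before; the configuration and split names below are copied from this card's metadata:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3"

# Each run is a split named after its timestamp; "latest" points at the newest.
latest = load_dataset(repo, "harness_gsm8k_5", split="latest")
pinned = load_dataset(repo, "harness_gsm8k_5", split="2023_10_16T22_47_30.949762")

# Per the metadata, both splits resolve to the same parquet files here,
# since the GSM8K eval was only part of the second run.
assert latest.num_rows == pinned.num_rows
```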
open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3
[ "region:us" ]
2023-08-18T17:54:03+00:00
{"pretty_name": "Evaluation run of yeontaek/Platypus2-13B-IA3", "dataset_summary": "Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-IA3](https://huggingface.co/yeontaek/Platypus2-13B-IA3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T22:47:30.949762](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3/blob/main/results_2023-10-16T22-47-30.949762.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626669685,\n \"f1\": 0.057964555369127514,\n \"f1_stderr\": 0.0013475913067505284,\n \"acc\": 0.43493522214636066,\n \"acc_stderr\": 0.01038750232963205\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626669685,\n \"f1\": 0.057964555369127514,\n \"f1_stderr\": 0.0013475913067505284\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \"acc_stderr\": 0.00871933902883306\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.01205566563043104\n }\n}\n```", "repo_url": "https://huggingface.co/yeontaek/Platypus2-13B-IA3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|arc:challenge|25_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T22_47_30.949762", "path": ["**/details_harness|drop|3_2023-10-16T22-47-30.949762.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T22-47-30.949762.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T22_47_30.949762", "path": ["**/details_harness|gsm8k|5_2023-10-16T22-47-30.949762.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T22-47-30.949762.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hellaswag|10_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:02:21.186475.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:02:21.186475.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T01:02:21.186475.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T01:02:21.186475.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T01:02:21.186475.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T22_47_30.949762", "path": ["**/details_harness|winogrande|5_2023-10-16T22-47-30.949762.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T22-47-30.949762.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T01_02_21.186475", "path": ["results_2023-08-18T01:02:21.186475.parquet"]}, {"split": "2023_10_16T22_47_30.949762", "path": ["results_2023-10-16T22-47-30.949762.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T22-47-30.949762.parquet"]}]}]}
2023-10-16T21:47:43+00:00
[]
[]
TAGS #region-us
ee44f19f446bc5beda197a7a9e6ed03bb4ee2b42
# Dataset Card for Evaluation run of yeontaek/llama-2-70b-IA3-guanaco

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/yeontaek/llama-2-70b-IA3-guanaco
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70b-IA3-guanaco](https://huggingface.co/yeontaek/llama-2-70b-IA3-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T01:35:02.299684](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco/blob/main/results_2023-10-23T01-35-02.299684.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.059354026845637585,
        "em_stderr": 0.0024197909382591906,
        "f1": 0.12265834731543575,
        "f1_stderr": 0.0026243794222964158,
        "acc": 0.5548770235038503,
        "acc_stderr": 0.011602676960733152
    },
    "harness|drop|3": {
        "em": 0.059354026845637585,
        "em_stderr": 0.0024197909382591906,
        "f1": 0.12265834731543575,
        "f1_stderr": 0.0026243794222964158
    },
    "harness|gsm8k|5": {
        "acc": 0.287338893100834,
        "acc_stderr": 0.012464677060107086
    },
    "harness|winogrande|5": {
        "acc": 0.8224151539068666,
        "acc_stderr": 0.01074067686135922
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
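As a supplement to the loading snippet in the Dataset Summary above, a short sketch of exploring the full set of configurations; this assumes only the public `datasets` helpers and the repo id given in the card:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco"

# Enumerate the 64 configurations (one per evaluated task, plus the aggregated "results").
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "results" config stores the aggregated metrics; the "latest" split tracks the newest run.
results = load_dataset(repo, "results", split="latest")
```

The same `load_dataset(repo, config, split="latest")` pattern works for any of the per-task configs listed in the metadata.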
open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco
[ "region:us" ]
2023-08-18T17:54:12+00:00
{"pretty_name": "Evaluation run of yeontaek/llama-2-70b-IA3-guanaco", "dataset_summary": "Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70b-IA3-guanaco](https://huggingface.co/yeontaek/llama-2-70b-IA3-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T01:35:02.299684](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco/blob/main/results_2023-10-23T01-35-02.299684.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.059354026845637585,\n \"em_stderr\": 0.0024197909382591906,\n \"f1\": 0.12265834731543575,\n \"f1_stderr\": 0.0026243794222964158,\n \"acc\": 0.5548770235038503,\n \"acc_stderr\": 0.011602676960733152\n },\n \"harness|drop|3\": {\n \"em\": 0.059354026845637585,\n \"em_stderr\": 0.0024197909382591906,\n \"f1\": 0.12265834731543575,\n \"f1_stderr\": 0.0026243794222964158\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.287338893100834,\n \"acc_stderr\": 0.012464677060107086\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.01074067686135922\n }\n}\n```", "repo_url": "https://huggingface.co/yeontaek/llama-2-70b-IA3-guanaco", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|arc:challenge|25_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T01_35_02.299684", "path": ["**/details_harness|drop|3_2023-10-23T01-35-02.299684.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T01-35-02.299684.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T01_35_02.299684", "path": ["**/details_harness|gsm8k|5_2023-10-23T01-35-02.299684.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T01-35-02.299684.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hellaswag|10_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:44:14.521953.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:44:14.521953.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T03:44:14.521953.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T03:44:14.521953.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T03:44:14.521953.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T01_35_02.299684", "path": ["**/details_harness|winogrande|5_2023-10-23T01-35-02.299684.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T01-35-02.299684.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T03_44_14.521953", "path": ["results_2023-08-18T03:44:14.521953.parquet"]}, {"split": "2023_10_23T01_35_02.299684", "path": ["results_2023-10-23T01-35-02.299684.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T01-35-02.299684.parquet"]}]}]}
2023-10-23T00:35:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yeontaek/llama-2-70b-IA3-guanaco ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model yeontaek/llama-2-70b-IA3-guanaco on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T01:35:02.299684 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
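A minimal sketch of the load call referenced just above, assuming the details repo follows the leaderboard's `details_<org>__<model>` naming pattern (the exact repo id is not spelled out in this field, so treat it as an assumption):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's details_<org>__<model> pattern;
# "harness_winogrande_5" is one of the per-task configs listed in the metadata.
data = load_dataset(
    "open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco",
    "harness_winogrande_5",
    split="train",  # per the card, "train" always points at the latest results
)
```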
[ "# Dataset Card for Evaluation run of yeontaek/llama-2-70b-IA3-guanaco", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-70b-IA3-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T01:35:02.299684(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yeontaek/llama-2-70b-IA3-guanaco", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-70b-IA3-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T01:35:02.299684(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yeontaek/llama-2-70b-IA3-guanaco## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-70b-IA3-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T01:35:02.299684(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b20c4cbd9f386c2d7b3c852cd06c81a8f747a430
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-IA3](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-16T07:16:50.024498](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3/blob/main/results_2023-10-16T07-16-50.024498.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ```python { "all": { "em": 0.004299496644295302, "em_stderr": 0.0006700586558629855, "f1": 0.07570155201342293, "f1_stderr": 0.0016454710303896235, "acc": 0.44469214138811486, "acc_stderr": 0.010351218038230171 }, "harness|drop|3": { "em": 0.004299496644295302, "em_stderr": 0.0006700586558629855, "f1": 0.07570155201342293, "f1_stderr": 0.0016454710303896235 }, "harness|gsm8k|5": { "acc": 0.11827141774071266, "acc_stderr": 0.008895075852434951 }, "harness|winogrande|5": { "acc": 0.771112865035517, "acc_stderr": 0.01180736022402539 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
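The aggregated metrics shown in the card can also be pulled directly from the `results` config; a minimal sketch, with the caveat that the column layout of the results parquet is not documented here, so the schema inspection is the point of the example:

```python
from datasets import load_dataset

# "results"/"latest" resolves to results_2023-10-16T07-16-50.024498.parquet
# per the configs below; inspect the schema before relying on specific columns.
results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3",
    "results",
    split="latest",
)
print(results.column_names)  # discover the aggregated-metrics layout
print(results[0])            # first (and typically only) row of the run summary
```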
open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3
[ "region:us" ]
2023-08-18T17:54:21+00:00
{"pretty_name": "Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3", "dataset_summary": "Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-IA3](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T07:16:50.024498](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3/blob/main/results_2023-10-16T07-16-50.024498.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004299496644295302,\n \"em_stderr\": 0.0006700586558629855,\n \"f1\": 0.07570155201342293,\n \"f1_stderr\": 0.0016454710303896235,\n \"acc\": 0.44469214138811486,\n \"acc_stderr\": 0.010351218038230171\n },\n \"harness|drop|3\": {\n \"em\": 0.004299496644295302,\n \"em_stderr\": 0.0006700586558629855,\n \"f1\": 0.07570155201342293,\n \"f1_stderr\": 0.0016454710303896235\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11827141774071266,\n \"acc_stderr\": 0.008895075852434951\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.01180736022402539\n }\n}\n```", "repo_url": "https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|arc:challenge|25_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T07_16_50.024498", "path": ["**/details_harness|drop|3_2023-10-16T07-16-50.024498.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T07-16-50.024498.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T07_16_50.024498", "path": ["**/details_harness|gsm8k|5_2023-10-16T07-16-50.024498.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T07-16-50.024498.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hellaswag|10_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T07:56:15.654577.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:56:15.654577.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:56:15.654577.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T07:56:15.654577.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T07:56:15.654577.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T07:56:15.654577.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T07_16_50.024498", "path": ["**/details_harness|winogrande|5_2023-10-16T07-16-50.024498.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T07-16-50.024498.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T07_56_15.654577", "path": ["results_2023-08-18T07:56:15.654577.parquet"]}, {"split": "2023_10_16T07_16_50.024498", "path": ["results_2023-10-16T07-16-50.024498.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T07-16-50.024498.parquet"]}]}]}
2023-10-16T06:17:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model yeontaek/Platypus2xOpenOrca-13B-IA3 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T07:16:50.024498 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
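A minimal sketch of the load call referenced just above, using the repo id given in the full card earlier in this record:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3",
    "harness_winogrande_5",  # one per-task config among the 64 listed
    split="train",           # per the card, "train" always points at the latest results
)
```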
[ "# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/Platypus2xOpenOrca-13B-IA3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T07:16:50.024498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/Platypus2xOpenOrca-13B-IA3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T07:16:50.024498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/Platypus2xOpenOrca-13B-IA3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T07:16:50.024498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c053185cbc7793e7c7ffe365934e62c68d347d6c
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-LoRa

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/yeontaek/Platypus2-13B-LoRa
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-LoRa](https://huggingface.co/yeontaek/Platypus2-13B-LoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa",
                    "harness_winogrande_5",
                    split="train")
```

## Latest results

These are the [latest results from run 2023-09-18T02:18:05.535474](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa/blob/main/results_2023-09-18T02-18-05.535474.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.029991610738255032,
        "em_stderr": 0.0017467360834755531,
        "f1": 0.09242449664429517,
        "f1_stderr": 0.0021342871324921244,
        "acc": 0.417165368277252,
        "acc_stderr": 0.009636596178855414
    },
    "harness|drop|3": {
        "em": 0.029991610738255032,
        "em_stderr": 0.0017467360834755531,
        "f1": 0.09242449664429517,
        "f1_stderr": 0.0021342871324921244
    },
    "harness|gsm8k|5": {
        "acc": 0.07505686125852919,
        "acc_stderr": 0.007257633145486642
    },
    "harness|winogrande|5": {
        "acc": 0.7592738752959748,
        "acc_stderr": 0.012015559212224185
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
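The card's snippet loads one task's details; the aggregate metrics live in the "results" config, whose "latest" split (per the config listing further down this entry) mirrors the newest timestamped run. A small sketch:

```python
from datasets import load_dataset

# "results" aggregates every run; "latest" mirrors the newest
# timestamped split (2023_09_18T02_18_05.535474 for this model).
results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```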
open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa
[ "region:us" ]
2023-08-18T17:54:30+00:00
{"pretty_name": "Evaluation run of yeontaek/Platypus2-13B-LoRa", "dataset_summary": "Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-LoRa](https://huggingface.co/yeontaek/Platypus2-13B-LoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T02:18:05.535474](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa/blob/main/results_2023-09-18T02-18-05.535474.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029991610738255032,\n \"em_stderr\": 0.0017467360834755531,\n \"f1\": 0.09242449664429517,\n \"f1_stderr\": 0.0021342871324921244,\n \"acc\": 0.417165368277252,\n \"acc_stderr\": 0.009636596178855414\n },\n \"harness|drop|3\": {\n \"em\": 0.029991610738255032,\n \"em_stderr\": 0.0017467360834755531,\n \"f1\": 0.09242449664429517,\n \"f1_stderr\": 0.0021342871324921244\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07505686125852919,\n \"acc_stderr\": 0.007257633145486642\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224185\n }\n}\n```", "repo_url": "https://huggingface.co/yeontaek/Platypus2-13B-LoRa", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|arc:challenge|25_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T02_18_05.535474", "path": ["**/details_harness|drop|3_2023-09-18T02-18-05.535474.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T02-18-05.535474.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T02_18_05.535474", "path": ["**/details_harness|gsm8k|5_2023-09-18T02-18-05.535474.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T02-18-05.535474.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hellaswag|10_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:15:43.856388.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:15:43.856388.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T01:15:43.856388.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T01:15:43.856388.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T01:15:43.856388.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T02_18_05.535474", "path": ["**/details_harness|winogrande|5_2023-09-18T02-18-05.535474.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T02-18-05.535474.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T01_15_43.856388", "path": ["results_2023-08-18T01:15:43.856388.parquet"]}, {"split": "2023_09_18T02_18_05.535474", "path": ["results_2023-09-18T02-18-05.535474.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T02-18-05.535474.parquet"]}]}]}
2023-09-18T01:18:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-LoRa

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model yeontaek/Platypus2-13B-LoRa on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch after this entry):

## Latest results

These are the latest results from run 2023-09-18T02:18:05.535474 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
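As with the previous entry, this processed copy drops the load snippet. One convenient way to inspect a run's per-example rows is via pandas (a sketch, assuming pandas is installed):

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa",
    "harness_winogrande_5",
    split="latest",
)
df = data.to_pandas()  # one row per evaluated example
print(df.shape, list(df.columns)[:10])
```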
[ "# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-LoRa", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/Platypus2-13B-LoRa on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T02:18:05.535474(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-LoRa", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/Platypus2-13B-LoRa on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T02:18:05.535474(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-LoRa## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/Platypus2-13B-LoRa on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T02:18:05.535474(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
740dc6d6cf1af22e4b8fae666a2db30c3af04870
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-LoRa

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-LoRa](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa",
                    "harness_winogrande_5",
                    split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T19:15:43.027910](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa/blob/main/results_2023-10-22T19-15-43.027910.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.05253775167785235,
        "em_stderr": 0.0022848429307723603,
        "f1": 0.13231438758389208,
        "f1_stderr": 0.0027053109392376827,
        "acc": 0.4207948548713987,
        "acc_stderr": 0.009454053864896365
    },
    "harness|drop|3": {
        "em": 0.05253775167785235,
        "em_stderr": 0.0022848429307723603,
        "f1": 0.13231438758389208,
        "f1_stderr": 0.0027053109392376827
    },
    "harness|gsm8k|5": {
        "acc": 0.0712661106899166,
        "acc_stderr": 0.007086462127954497
    },
    "harness|winogrande|5": {
        "acc": 0.7703235990528808,
        "acc_stderr": 0.011821645601838232
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
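The "Latest results" section links the raw aggregate file inside the repo; it can also be fetched directly with huggingface_hub (a sketch; the filename is taken from the link above):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated-results JSON referenced under "Latest results".
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa",
    filename="results_2023-10-22T19-15-43.027910.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)
print(list(payload.keys()))  # inspect the file's top-level structure
```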
open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa
[ "region:us" ]
2023-08-18T17:54:38+00:00
{"pretty_name": "Evaluation run of yeontaek/Platypus2xOpenOrca-13B-LoRa", "dataset_summary": "Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-LoRa](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T19:15:43.027910](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa/blob/main/results_2023-10-22T19-15-43.027910.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05253775167785235,\n \"em_stderr\": 0.0022848429307723603,\n \"f1\": 0.13231438758389208,\n \"f1_stderr\": 0.0027053109392376827,\n \"acc\": 0.4207948548713987,\n \"acc_stderr\": 0.009454053864896365\n },\n \"harness|drop|3\": {\n \"em\": 0.05253775167785235,\n \"em_stderr\": 0.0022848429307723603,\n \"f1\": 0.13231438758389208,\n \"f1_stderr\": 0.0027053109392376827\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \"acc_stderr\": 0.007086462127954497\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838232\n }\n}\n```", "repo_url": "https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|arc:challenge|25_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T19_15_43.027910", "path": ["**/details_harness|drop|3_2023-10-22T19-15-43.027910.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T19-15-43.027910.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T19_15_43.027910", "path": ["**/details_harness|gsm8k|5_2023-10-22T19-15-43.027910.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T19-15-43.027910.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hellaswag|10_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T14:49:25.189557.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:49:25.189557.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:49:25.189557.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T14:49:25.189557.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T14:49:25.189557.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T14:49:25.189557.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T19_15_43.027910", "path": ["**/details_harness|winogrande|5_2023-10-22T19-15-43.027910.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T19-15-43.027910.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T14_49_25.189557", "path": ["results_2023-08-18T14:49:25.189557.parquet"]}, {"split": "2023_10_22T19_15_43.027910", "path": ["results_2023-10-22T19-15-43.027910.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T19-15-43.027910.parquet"]}]}]}
2023-10-22T18:15:55+00:00
[]
[]
eba66f7b183bdef7fbae16a59c602a2291ba61f2
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-QLoRa

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/yeontaek/Platypus2-13B-QLoRa
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-QLoRa](https://huggingface.co/yeontaek/Platypus2-13B-QLoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T02:07:21.128388](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa/blob/main/results_2023-10-22T02-07-21.128388.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.007445469798657718,
        "em_stderr": 0.0008803652515899919,
        "f1": 0.06792785234899322,
        "f1_stderr": 0.001576095719649218,
        "acc": 0.4082075883226931,
        "acc_stderr": 0.008948818415880626
    },
    "harness|drop|3": {
        "em": 0.007445469798657718,
        "em_stderr": 0.0008803652515899919,
        "f1": 0.06792785234899322,
        "f1_stderr": 0.001576095719649218
    },
    "harness|gsm8k|5": {
        "acc": 0.050037907505686124,
        "acc_stderr": 0.006005442354577729
    },
    "harness|winogrande|5": {
        "acc": 0.7663772691397001,
        "acc_stderr": 0.011892194477183524
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
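For the aggregated metrics rather than per-sample details, a minimal sketch (an illustration added here, not part of the original card, assuming the standard `datasets` API) that loads the "results" configuration described in the summary; the `latest` split always tracks the most recent run:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics for each run; "latest"
# is the alias split pointing at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa",
    "results",
    split="latest",
)
print(results[0])  # aggregated em/f1/acc values, as in the JSON above
```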
open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa
[ "region:us" ]
2023-08-18T17:54:47+00:00
{"pretty_name": "Evaluation run of yeontaek/Platypus2-13B-QLoRa", "dataset_summary": "Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-QLoRa](https://huggingface.co/yeontaek/Platypus2-13B-QLoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T02:07:21.128388](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa/blob/main/results_2023-10-22T02-07-21.128388.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007445469798657718,\n \"em_stderr\": 0.0008803652515899919,\n \"f1\": 0.06792785234899322,\n \"f1_stderr\": 0.001576095719649218,\n \"acc\": 0.4082075883226931,\n \"acc_stderr\": 0.008948818415880626\n },\n \"harness|drop|3\": {\n \"em\": 0.007445469798657718,\n \"em_stderr\": 0.0008803652515899919,\n \"f1\": 0.06792785234899322,\n \"f1_stderr\": 0.001576095719649218\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.050037907505686124,\n \"acc_stderr\": 0.006005442354577729\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n }\n}\n```", "repo_url": "https://huggingface.co/yeontaek/Platypus2-13B-QLoRa", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|arc:challenge|25_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T02_07_21.128388", "path": ["**/details_harness|drop|3_2023-10-22T02-07-21.128388.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T02-07-21.128388.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T02_07_21.128388", "path": ["**/details_harness|gsm8k|5_2023-10-22T02-07-21.128388.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T02-07-21.128388.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hellaswag|10_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T03:06:05.909035.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T03:06:05.909035.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T02_07_21.128388", "path": ["**/details_harness|winogrande|5_2023-10-22T02-07-21.128388.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T02-07-21.128388.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T03_06_05.909035", "path": ["results_2023-08-18T03:06:05.909035.parquet"]}, {"split": "2023_10_22T02_07_21.128388", "path": ["results_2023-10-22T02-07-21.128388.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T02-07-21.128388.parquet"]}]}]}
2023-10-22T01:07:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-QLoRa ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model yeontaek/Platypus2-13B-QLoRa on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-22T02:07:21.128388 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
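The flattened card above drops the loading snippet that the full card carries. A minimal sketch of what that snippet looks like, with the repo id assumed from the leaderboard's `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> convention; one config per task.
data = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
```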
[ "# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-QLoRa", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/Platypus2-13B-QLoRa on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T02:07:21.128388(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-QLoRa", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/Platypus2-13B-QLoRa on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T02:07:21.128388(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-QLoRa## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/Platypus2-13B-QLoRa on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T02:07:21.128388(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
72307b00549194d174fc873cb4a9e77ac3f820ba
# Dataset Card for Evaluation run of Aspik101/trurl-2-13b-pl-instruct_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Aspik101/trurl-2-13b-pl-instruct_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/trurl-2-13b-pl-instruct_unload](https://huggingface.co/Aspik101/trurl-2-13b-pl-instruct_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T18:21:08.741261](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload/blob/main/results_2023-10-15T18-21-08.741261.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.3252936241610738,
        "em_stderr": 0.004797719286876321,
        "f1": 0.42710885067114435,
        "f1_stderr": 0.004610322827124305,
        "acc": 0.4327753619762885,
        "acc_stderr": 0.010645351487263238
    },
    "harness|drop|3": {
        "em": 0.3252936241610738,
        "em_stderr": 0.004797719286876321,
        "f1": 0.42710885067114435,
        "f1_stderr": 0.004610322827124305
    },
    "harness|gsm8k|5": {
        "acc": 0.12206216830932524,
        "acc_stderr": 0.009017054965766493
    },
    "harness|winogrande|5": {
        "acc": 0.7434885556432518,
        "acc_stderr": 0.012273648008759982
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
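The card above mentions the "results" configuration that aggregates every run; a small sketch of pulling the newest aggregate numbers (config and split names are taken verbatim from this dataset's metadata):

```python
from datasets import load_dataset

# "results" aggregates all runs; the "latest" split tracks the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload",
    "results",
    split="latest",
)
print(results[0])  # the aggregated em/f1/acc figures shown under "Latest results"
```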
open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload
[ "region:us" ]
2023-08-18T17:54:56+00:00
{"pretty_name": "Evaluation run of Aspik101/trurl-2-13b-pl-instruct_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/trurl-2-13b-pl-instruct_unload](https://huggingface.co/Aspik101/trurl-2-13b-pl-instruct_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T18:21:08.741261](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload/blob/main/results_2023-10-15T18-21-08.741261.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3252936241610738,\n \"em_stderr\": 0.004797719286876321,\n \"f1\": 0.42710885067114435,\n \"f1_stderr\": 0.004610322827124305,\n \"acc\": 0.4327753619762885,\n \"acc_stderr\": 0.010645351487263238\n },\n \"harness|drop|3\": {\n \"em\": 0.3252936241610738,\n \"em_stderr\": 0.004797719286876321,\n \"f1\": 0.42710885067114435,\n \"f1_stderr\": 0.004610322827124305\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12206216830932524,\n \"acc_stderr\": 0.009017054965766493\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759982\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/trurl-2-13b-pl-instruct_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|arc:challenge|25_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T18_21_08.741261", "path": ["**/details_harness|drop|3_2023-10-15T18-21-08.741261.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T18-21-08.741261.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T18_21_08.741261", "path": ["**/details_harness|gsm8k|5_2023-10-15T18-21-08.741261.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T18-21-08.741261.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hellaswag|10_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:28:28.841723.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:28:28.841723.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:28:28.841723.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T09:28:28.841723.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T09:28:28.841723.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T09:28:28.841723.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T18_21_08.741261", "path": ["**/details_harness|winogrande|5_2023-10-15T18-21-08.741261.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T18-21-08.741261.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T09_28_28.841723", "path": ["results_2023-08-18T09:28:28.841723.parquet"]}, {"split": "2023_10_15T18_21_08.741261", "path": ["results_2023-10-15T18-21-08.741261.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T18-21-08.741261.parquet"]}]}]}
2023-10-15T17:21:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/trurl-2-13b-pl-instruct_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/trurl-2-13b-pl-instruct_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T18:21:08.741261 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
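The load snippet referred to above was stripped when this card text was flattened. A minimal sketch of the call it describes, following the pattern used by the full cards elsewhere in this dump; the repository id is inferred from the `open-llm-leaderboard/details_<org>__<model>` naming convention and should be treated as an assumption:

```python
from datasets import load_dataset

# Load the per-example details for one evaluation task of this run.
# Repository id inferred from the dump's naming convention (assumption);
# the "harness_winogrande_5" config is listed in the metadata above.
data = load_dataset("open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload",
                    "harness_winogrande_5",
                    split="train")
```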
[ "# Dataset Card for Evaluation run of Aspik101/trurl-2-13b-pl-instruct_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/trurl-2-13b-pl-instruct_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T18:21:08.741261(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/trurl-2-13b-pl-instruct_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/trurl-2-13b-pl-instruct_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T18:21:08.741261(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/trurl-2-13b-pl-instruct_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/trurl-2-13b-pl-instruct_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T18:21:08.741261(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3025b2a6ec102e488f286c4beee5c30068a593aa
# Dataset Card for Evaluation run of Aspik101/trurl-2-7b-pl-instruct_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Aspik101/trurl-2-7b-pl-instruct_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/trurl-2-7b-pl-instruct_unload](https://huggingface.co/Aspik101/trurl-2-7b-pl-instruct_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__trurl-2-7b-pl-instruct_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T23:43:45.355114](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__trurl-2-7b-pl-instruct_unload/blob/main/results_2023-09-22T23-43-45.355114.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.16096895973154363,
        "em_stderr": 0.0037635677120072437,
        "f1": 0.22060822147650985,
        "f1_stderr": 0.0038388662178584767,
        "acc": 0.3986331756197593,
        "acc_stderr": 0.009900867594093993
    },
    "harness|drop|3": {
        "em": 0.16096895973154363,
        "em_stderr": 0.0037635677120072437,
        "f1": 0.22060822147650985,
        "f1_stderr": 0.0038388662178584767
    },
    "harness|gsm8k|5": {
        "acc": 0.07429871114480667,
        "acc_stderr": 0.007223844172845576
    },
    "harness|winogrande|5": {
        "acc": 0.7229676400947119,
        "acc_stderr": 0.012577891015342412
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
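The card above notes that the aggregated metrics live in a separate "results" configuration, and its metadata (below) declares a "latest" split that always points at the newest run. A minimal sketch of reading those aggregates, assuming the standard `datasets` API:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split resolves to
# the most recent evaluation run (2023-09-22 for this model).
results = load_dataset("open-llm-leaderboard/details_Aspik101__trurl-2-7b-pl-instruct_unload",
                       "results",
                       split="latest")
print(results[0])  # one row of aggregated run-level metrics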
open-llm-leaderboard/details_Aspik101__trurl-2-7b-pl-instruct_unload
[ "region:us" ]
2023-08-18T17:55:05+00:00
{"pretty_name": "Evaluation run of Aspik101/trurl-2-7b-pl-instruct_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/trurl-2-7b-pl-instruct_unload](https://huggingface.co/Aspik101/trurl-2-7b-pl-instruct_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__trurl-2-7b-pl-instruct_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T23:43:45.355114](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__trurl-2-7b-pl-instruct_unload/blob/main/results_2023-09-22T23-43-45.355114.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.16096895973154363,\n \"em_stderr\": 0.0037635677120072437,\n \"f1\": 0.22060822147650985,\n \"f1_stderr\": 0.0038388662178584767,\n \"acc\": 0.3986331756197593,\n \"acc_stderr\": 0.009900867594093993\n },\n \"harness|drop|3\": {\n \"em\": 0.16096895973154363,\n \"em_stderr\": 0.0037635677120072437,\n \"f1\": 0.22060822147650985,\n \"f1_stderr\": 0.0038388662178584767\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07429871114480667,\n \"acc_stderr\": 0.007223844172845576\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7229676400947119,\n \"acc_stderr\": 0.012577891015342412\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/trurl-2-7b-pl-instruct_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|arc:challenge|25_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T23_43_45.355114", "path": ["**/details_harness|drop|3_2023-09-22T23-43-45.355114.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T23-43-45.355114.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T23_43_45.355114", "path": ["**/details_harness|gsm8k|5_2023-09-22T23-43-45.355114.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T23-43-45.355114.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hellaswag|10_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:40:19.486608.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:40:19.486608.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:40:19.486608.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T14:40:19.486608.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T14:40:19.486608.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T14:40:19.486608.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T23_43_45.355114", "path": ["**/details_harness|winogrande|5_2023-09-22T23-43-45.355114.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T23-43-45.355114.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T14_40_19.486608", "path": ["results_2023-08-17T14:40:19.486608.parquet"]}, {"split": "2023_09_22T23_43_45.355114", "path": ["results_2023-09-22T23-43-45.355114.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T23-43-45.355114.parquet"]}]}]}
2023-09-22T22:43:57+00:00
[]
[]
TAGS #region-us
601c9fdb11bbfece354de1d8160d431c9b6da0f8
# Dataset Card for Evaluation run of jslin09/bloom-560m-finetuned-fraud

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/jslin09/bloom-560m-finetuned-fraud
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [jslin09/bloom-560m-finetuned-fraud](https://huggingface.co/jslin09/bloom-560m-finetuned-fraud) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T09:10:48.065151](https://huggingface.co/datasets/open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud/blob/main/results_2023-09-17T09-10-48.065151.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0026216442953020135,
        "em_stderr": 0.0005236685642965815,
        "f1": 0.0032707634228187916,
        "f1_stderr": 0.0005552444547661462,
        "acc": 0.24191002367797948,
        "acc_stderr": 0.0070225630654893005
    },
    "harness|drop|3": {
        "em": 0.0026216442953020135,
        "em_stderr": 0.0005236685642965815,
        "f1": 0.0032707634228187916,
        "f1_stderr": 0.0005552444547661462
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.48382004735595896,
        "acc_stderr": 0.014045126130978601
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
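The card explains that every run is exposed as a timestamp-named split in each configuration. The metadata below records the 2023-09-17 run for this model, so a single historical run can be loaded directly instead of the moving "latest" split; a minimal sketch assuming the standard `datasets` API:

```python
from datasets import load_dataset

# Load one specific run by its timestamp-named split (the split name is
# declared in this record's metadata) rather than "latest".
run = load_dataset("open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud",
                   "harness_gsm8k_5",
                   split="2023_09_17T09_10_48.065151")
```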
open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud
[ "region:us" ]
2023-08-18T17:55:14+00:00
{"pretty_name": "Evaluation run of jslin09/bloom-560m-finetuned-fraud", "dataset_summary": "Dataset automatically created during the evaluation run of model [jslin09/bloom-560m-finetuned-fraud](https://huggingface.co/jslin09/bloom-560m-finetuned-fraud) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T09:10:48.065151](https://huggingface.co/datasets/open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud/blob/main/results_2023-09-17T09-10-48.065151.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965815,\n \"f1\": 0.0032707634228187916,\n \"f1_stderr\": 0.0005552444547661462,\n \"acc\": 0.24191002367797948,\n \"acc_stderr\": 0.0070225630654893005\n },\n \"harness|drop|3\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965815,\n \"f1\": 0.0032707634228187916,\n \"f1_stderr\": 0.0005552444547661462\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.48382004735595896,\n \"acc_stderr\": 0.014045126130978601\n }\n}\n```", "repo_url": "https://huggingface.co/jslin09/bloom-560m-finetuned-fraud", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|arc:challenge|25_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T09_10_48.065151", "path": ["**/details_harness|drop|3_2023-09-17T09-10-48.065151.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T09-10-48.065151.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T09_10_48.065151", "path": ["**/details_harness|gsm8k|5_2023-09-17T09-10-48.065151.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T09-10-48.065151.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hellaswag|10_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:20:24.088120.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:20:24.088120.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T18:20:24.088120.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T18:20:24.088120.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T18:20:24.088120.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T09_10_48.065151", "path": ["**/details_harness|winogrande|5_2023-09-17T09-10-48.065151.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T09-10-48.065151.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T18_20_24.088120", "path": ["results_2023-08-17T18:20:24.088120.parquet"]}, {"split": "2023_09_17T09_10_48.065151", "path": ["results_2023-09-17T09-10-48.065151.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T09-10-48.065151.parquet"]}]}]}
2023-09-17T08:10:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jslin09/bloom-560m-finetuned-fraud ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model jslin09/bloom-560m-finetuned-fraud on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can, for instance, do the following: ## Latest results These are the latest results from run 2023-09-17T09:10:48.065151 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of jslin09/bloom-560m-finetuned-fraud", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jslin09/bloom-560m-finetuned-fraud on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T09:10:48.065151(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jslin09/bloom-560m-finetuned-fraud", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jslin09/bloom-560m-finetuned-fraud on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T09:10:48.065151(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jslin09/bloom-560m-finetuned-fraud## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jslin09/bloom-560m-finetuned-fraud on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T09:10:48.065151(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
4da75f834c70ef4ef746afb8eee09bdb87915d07
# Dataset Card for Evaluation run of LinkSoul/Chinese-Llama-2-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/LinkSoul/Chinese-Llama-2-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [LinkSoul/Chinese-Llama-2-7b](https://huggingface.co/LinkSoul/Chinese-Llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T12:57:13.908145](https://huggingface.co/datasets/open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b/blob/main/results_2023-09-17T12-57-13.908145.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.03271812080536913,
        "em_stderr": 0.0018218405030911197,
        "f1": 0.0883210989932887,
        "f1_stderr": 0.002162976048256438,
        "acc": 0.43625495385576474,
        "acc_stderr": 0.011101966395253314
    },
    "harness|drop|3": {
        "em": 0.03271812080536913,
        "em_stderr": 0.0018218405030911197,
        "f1": 0.0883210989932887,
        "f1_stderr": 0.002162976048256438
    },
    "harness|gsm8k|5": {
        "acc": 0.14480667172100076,
        "acc_stderr": 0.009693234799052694
    },
    "harness|winogrande|5": {
        "acc": 0.7277032359905288,
        "acc_stderr": 0.012510697991453934
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
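To pin an analysis to one particular evaluation run rather than the moving "latest" split, the timestamped split name can be passed explicitly. A sketch using the run timestamp recorded in this record's config metadata; only the repository, configuration, and split names are taken from the record, and everything else is illustrative:

```python
from datasets import load_dataset

# Load one specific run by its timestamped split name instead of "latest".
# The split name below comes from this record's config metadata.
data = load_dataset(
    "open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b",
    "harness_gsm8k_5",
    split="2023_09_17T12_57_13.908145",
)
print(len(data))
```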
open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b
[ "region:us" ]
2023-08-18T17:55:24+00:00
{"pretty_name": "Evaluation run of LinkSoul/Chinese-Llama-2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [LinkSoul/Chinese-Llama-2-7b](https://huggingface.co/LinkSoul/Chinese-Llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T12:57:13.908145](https://huggingface.co/datasets/open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b/blob/main/results_2023-09-17T12-57-13.908145.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03271812080536913,\n \"em_stderr\": 0.0018218405030911197,\n \"f1\": 0.0883210989932887,\n \"f1_stderr\": 0.002162976048256438,\n \"acc\": 0.43625495385576474,\n \"acc_stderr\": 0.011101966395253314\n },\n \"harness|drop|3\": {\n \"em\": 0.03271812080536913,\n \"em_stderr\": 0.0018218405030911197,\n \"f1\": 0.0883210989932887,\n \"f1_stderr\": 0.002162976048256438\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14480667172100076,\n \"acc_stderr\": 0.009693234799052694\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453934\n }\n}\n```", "repo_url": "https://huggingface.co/LinkSoul/Chinese-Llama-2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|arc:challenge|25_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T12_57_13.908145", "path": ["**/details_harness|drop|3_2023-09-17T12-57-13.908145.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T12-57-13.908145.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T12_57_13.908145", "path": ["**/details_harness|gsm8k|5_2023-09-17T12-57-13.908145.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T12-57-13.908145.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hellaswag|10_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:27:31.562743.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:27:31.562743.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T18:27:31.562743.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T18:27:31.562743.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T18:27:31.562743.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T12_57_13.908145", "path": ["**/details_harness|winogrande|5_2023-09-17T12-57-13.908145.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T12-57-13.908145.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T18_27_31.562743", "path": ["results_2023-08-17T18:27:31.562743.parquet"]}, {"split": "2023_09_17T12_57_13.908145", "path": ["results_2023-09-17T12-57-13.908145.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T12-57-13.908145.parquet"]}]}]}
2023-09-17T11:57:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of LinkSoul/Chinese-Llama-2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model LinkSoul/Chinese-Llama-2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T12:57:13.908145 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
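A minimal sketch of the loading step referenced above ("you can for instance do the following"), assuming the repository follows the `details_<org>__<model>` naming convention used by the other evaluation-details repos in this dump:

```python
from datasets import load_dataset

# Assumption: the repo id below is inferred from the details_<org>__<model>
# pattern seen elsewhere in this dump, not stated in this record itself.
data = load_dataset(
    "open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b",
    "harness_winogrande_5",
    split="train",
)
```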
[ "# Dataset Card for Evaluation run of LinkSoul/Chinese-Llama-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LinkSoul/Chinese-Llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T12:57:13.908145(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of LinkSoul/Chinese-Llama-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LinkSoul/Chinese-Llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T12:57:13.908145(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of LinkSoul/Chinese-Llama-2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model LinkSoul/Chinese-Llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T12:57:13.908145(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d8a6f92712b8f52d016729d83ac16998146614af
# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/FelixChao/llama2-13b-math1.2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [FelixChao/llama2-13b-math1.2](https://huggingface.co/FelixChao/llama2-13b-math1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-17T02:02:50.506714](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2/blob/main/results_2023-10-17T02-02-50.506714.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.08536073825503356, "em_stderr": 0.002861499356149465, "f1": 0.16235318791946293, "f1_stderr": 0.003133455634092774, "acc": 0.4263155280751903, "acc_stderr": 0.010451092603365564 }, "harness|drop|3": { "em": 0.08536073825503356, "em_stderr": 0.002861499356149465, "f1": 0.16235318791946293, "f1_stderr": 0.003133455634092774 }, "harness|gsm8k|5": { "acc": 0.10993176648976498, "acc_stderr": 0.008616195587865416 }, "harness|winogrande|5": { "acc": 0.7426992896606156, "acc_stderr": 0.012285989618865713 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
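The card's snippet loads one per-task configuration; the aggregated scores live in the "results" configuration it mentions. A minimal sketch, assuming the "latest" split alias declared in this record's metadata below:

```python
from datasets import load_dataset

# "results" aggregates all task scores for a run; "latest" is the alias split
# this record's metadata declares alongside the timestamped per-run splits.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2",
    "results",
    split="latest",
)
```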
open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2
[ "region:us" ]
2023-08-18T17:55:32+00:00
{"pretty_name": "Evaluation run of FelixChao/llama2-13b-math1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/llama2-13b-math1.2](https://huggingface.co/FelixChao/llama2-13b-math1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T02:02:50.506714](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2/blob/main/results_2023-10-17T02-02-50.506714.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08536073825503356,\n \"em_stderr\": 0.002861499356149465,\n \"f1\": 0.16235318791946293,\n \"f1_stderr\": 0.003133455634092774,\n \"acc\": 0.4263155280751903,\n \"acc_stderr\": 0.010451092603365564\n },\n \"harness|drop|3\": {\n \"em\": 0.08536073825503356,\n \"em_stderr\": 0.002861499356149465,\n \"f1\": 0.16235318791946293,\n \"f1_stderr\": 0.003133455634092774\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10993176648976498,\n \"acc_stderr\": 0.008616195587865416\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.012285989618865713\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/llama2-13b-math1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|arc:challenge|25_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T16_38_50.665467", "path": ["**/details_harness|drop|3_2023-10-16T16-38-50.665467.parquet"]}, {"split": "2023_10_17T02_02_50.506714", "path": ["**/details_harness|drop|3_2023-10-17T02-02-50.506714.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T02-02-50.506714.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T16_38_50.665467", "path": ["**/details_harness|gsm8k|5_2023-10-16T16-38-50.665467.parquet"]}, {"split": "2023_10_17T02_02_50.506714", "path": ["**/details_harness|gsm8k|5_2023-10-17T02-02-50.506714.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T02-02-50.506714.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T11_24_31.239858", 
"path": ["**/details_harness|hellaswag|10_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:24:31.239858.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:24:31.239858.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:24:31.239858.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-18T11:24:31.239858.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:24:31.239858.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T11:24:31.239858.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T11:24:31.239858.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T16_38_50.665467", "path": ["**/details_harness|winogrande|5_2023-10-16T16-38-50.665467.parquet"]}, {"split": "2023_10_17T02_02_50.506714", "path": ["**/details_harness|winogrande|5_2023-10-17T02-02-50.506714.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T02-02-50.506714.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T11_24_31.239858", "path": ["results_2023-08-18T11:24:31.239858.parquet"]}, {"split": "2023_10_16T16_38_50.665467", "path": ["results_2023-10-16T16-38-50.665467.parquet"]}, {"split": "2023_10_17T02_02_50.506714", "path": ["results_2023-10-17T02-02-50.506714.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T02-02-50.506714.parquet"]}]}]}
2023-10-17T01:03:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model FelixChao/llama2-13b-math1.2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch below): ## Latest results These are the latest results from run 2023-10-17T02:02:50.506714 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
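The flattened card above keeps the sentence "To load the details from a run, you can for instance do the following" but the text processing dropped the snippet it refers to. A minimal sketch of that load call, assuming the standard `datasets` API and the leaderboard's usual `details_<org>__<model>` repository naming:

```python
from datasets import load_dataset

# 5-shot Winogrande details for this model; the repo id below follows the
# naming pattern used by the other cards in this dump (assumed, not quoted).
data = load_dataset(
    "open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2",
    "harness_winogrande_5",
    split="train",  # "train" always points at the latest run
)
```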
[ "# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/llama2-13b-math1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T02:02:50.506714(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/llama2-13b-math1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T02:02:50.506714(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/llama2-13b-math1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T02:02:50.506714(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b976e006b13f52c2e7ad0cd84061a64ccbe14607
# Dataset Card for Evaluation run of FelixChao/vicuna-7B-physics

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/FelixChao/vicuna-7B-physics
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [FelixChao/vicuna-7B-physics](https://huggingface.co/FelixChao/vicuna-7B-physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__vicuna-7B-physics",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-16T22:59:16.293660](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__vicuna-7B-physics/blob/main/results_2023-10-16T22-59-16.293660.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001153523489932886,
        "em_stderr": 0.0003476179896857083,
        "f1": 0.0649444211409399,
        "f1_stderr": 0.001377707459666866,
        "acc": 0.3681106025528177,
        "acc_stderr": 0.009254111861396252
    },
    "harness|drop|3": {
        "em": 0.001153523489932886,
        "em_stderr": 0.0003476179896857083,
        "f1": 0.0649444211409399,
        "f1_stderr": 0.001377707459666866
    },
    "harness|gsm8k|5": {
        "acc": 0.04245640636846096,
        "acc_stderr": 0.005553837749990046
    },
    "harness|winogrande|5": {
        "acc": 0.6937647987371744,
        "acc_stderr": 0.012954385972802457
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
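The summary above mentions the aggregated "results" configuration, and the metadata below defines a "latest" split for it; a minimal sketch of reading the newest aggregated metrics (same repo id as the snippet above):

```python
from datasets import load_dataset

# "results" aggregates every evaluation run; the "latest" split points at
# the most recent one (both names come from the config list in the metadata).
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__vicuna-7B-physics",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```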
open-llm-leaderboard/details_FelixChao__vicuna-7B-physics
[ "region:us" ]
2023-08-18T17:55:44+00:00
{"pretty_name": "Evaluation run of FelixChao/vicuna-7B-physics", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/vicuna-7B-physics](https://huggingface.co/FelixChao/vicuna-7B-physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__vicuna-7B-physics\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T22:59:16.293660](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__vicuna-7B-physics/blob/main/results_2023-10-16T22-59-16.293660.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857083,\n \"f1\": 0.0649444211409399,\n \"f1_stderr\": 0.001377707459666866,\n \"acc\": 0.3681106025528177,\n \"acc_stderr\": 0.009254111861396252\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857083,\n \"f1\": 0.0649444211409399,\n \"f1_stderr\": 0.001377707459666866\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04245640636846096,\n \"acc_stderr\": 0.005553837749990046\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6937647987371744,\n \"acc_stderr\": 0.012954385972802457\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/vicuna-7B-physics", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|arc:challenge|25_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T22_59_16.293660", "path": ["**/details_harness|drop|3_2023-10-16T22-59-16.293660.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T22-59-16.293660.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T22_59_16.293660", "path": ["**/details_harness|gsm8k|5_2023-10-16T22-59-16.293660.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T22-59-16.293660.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hellaswag|10_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T10:17:03.743373.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T10:17:03.743373.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T10:17:03.743373.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T10:17:03.743373.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T10:17:03.743373.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T22_59_16.293660", "path": ["**/details_harness|winogrande|5_2023-10-16T22-59-16.293660.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T22-59-16.293660.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T10_17_03.743373", "path": ["results_2023-08-18T10:17:03.743373.parquet"]}, {"split": "2023_10_16T22_59_16.293660", "path": ["results_2023-10-16T22-59-16.293660.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T22-59-16.293660.parquet"]}]}]}
2023-10-16T21:59:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FelixChao/vicuna-7B-physics ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model FelixChao/vicuna-7B-physics on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T22:59:16.293660 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of FelixChao/vicuna-7B-physics", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/vicuna-7B-physics on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T22:59:16.293660(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FelixChao/vicuna-7B-physics", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/vicuna-7B-physics on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T22:59:16.293660(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FelixChao/vicuna-7B-physics## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/vicuna-7B-physics on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T22:59:16.293660(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3a3ce167929b1c63eac0f949def71422de41b4c5
# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.1

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/FelixChao/llama2-13b-math1.1
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [FelixChao/llama2-13b-math1.1](https://huggingface.co/FelixChao/llama2-13b-math1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-16T00:33:55.833296](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1/blob/main/results_2023-10-16T00-33-55.833296.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.055683724832214766,
        "em_stderr": 0.002348348394190227,
        "f1": 0.13264366610738232,
        "f1_stderr": 0.0027382594726878934,
        "acc": 0.42558849383038144,
        "acc_stderr": 0.010386118205480476
    },
    "harness|drop|3": {
        "em": 0.055683724832214766,
        "em_stderr": 0.002348348394190227,
        "f1": 0.13264366610738232,
        "f1_stderr": 0.0027382594726878934
    },
    "harness|gsm8k|5": {
        "acc": 0.1068991660348749,
        "acc_stderr": 0.008510982565520478
    },
    "harness|winogrande|5": {
        "acc": 0.744277821625888,
        "acc_stderr": 0.012261253845440473
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
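The card's default example loads only `harness_winogrande_5`. As a minimal additional sketch (assuming the `datasets` library is installed; the `results` config, the `harness_gsm8k_5` config, and the timestamped split name below are taken from this dataset's metadata rather than from the card itself), the aggregated metrics or a single historical run can be loaded the same way:

```python
from datasets import load_dataset

# Aggregated metrics across runs; the "latest" split always points at the
# most recent evaluation of this model (here 2023-10-16T00-33-55.833296).
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1",
    "results",
    split="latest",
)

# Per-sample details for one task from one specific run, selected via the
# timestamped split name declared in the dataset configs.
gsm8k_run = load_dataset(
    "open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1",
    "harness_gsm8k_5",
    split="2023_10_16T00_33_55.833296",
)
```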
open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1
[ "region:us" ]
2023-08-18T17:55:52+00:00
{"pretty_name": "Evaluation run of FelixChao/llama2-13b-math1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/llama2-13b-math1.1](https://huggingface.co/FelixChao/llama2-13b-math1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T00:33:55.833296](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1/blob/main/results_2023-10-16T00-33-55.833296.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.055683724832214766,\n \"em_stderr\": 0.002348348394190227,\n \"f1\": 0.13264366610738232,\n \"f1_stderr\": 0.0027382594726878934,\n \"acc\": 0.42558849383038144,\n \"acc_stderr\": 0.010386118205480476\n },\n \"harness|drop|3\": {\n \"em\": 0.055683724832214766,\n \"em_stderr\": 0.002348348394190227,\n \"f1\": 0.13264366610738232,\n \"f1_stderr\": 0.0027382594726878934\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1068991660348749,\n \"acc_stderr\": 0.008510982565520478\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440473\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/llama2-13b-math1.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|arc:challenge|25_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T01_52_04.935110", "path": ["**/details_harness|drop|3_2023-09-18T01-52-04.935110.parquet"]}, {"split": "2023_10_16T00_33_55.833296", "path": ["**/details_harness|drop|3_2023-10-16T00-33-55.833296.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T00-33-55.833296.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T01_52_04.935110", "path": ["**/details_harness|gsm8k|5_2023-09-18T01-52-04.935110.parquet"]}, {"split": "2023_10_16T00_33_55.833296", "path": ["**/details_harness|gsm8k|5_2023-10-16T00-33-55.833296.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T00-33-55.833296.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T11_29_01.098404", 
"path": ["**/details_harness|hellaswag|10_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:29:01.098404.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:29:01.098404.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:29:01.098404.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-18T11:29:01.098404.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:29:01.098404.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T11:29:01.098404.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T11:29:01.098404.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T01_52_04.935110", "path": ["**/details_harness|winogrande|5_2023-09-18T01-52-04.935110.parquet"]}, {"split": "2023_10_16T00_33_55.833296", "path": ["**/details_harness|winogrande|5_2023-10-16T00-33-55.833296.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T00-33-55.833296.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T11_29_01.098404", "path": ["results_2023-08-18T11:29:01.098404.parquet"]}, {"split": "2023_09_18T01_52_04.935110", "path": ["results_2023-09-18T01-52-04.935110.parquet"]}, {"split": "2023_10_16T00_33_55.833296", "path": ["results_2023-10-16T00-33-55.833296.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T00-33-55.833296.parquet"]}]}]}
2023-10-15T23:34:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model FelixChao/llama2-13b-math1.1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can, for instance, do the following: ## Latest results These are the latest results from run 2023-10-16T00:33:55.833296 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/llama2-13b-math1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can, for instance, do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T00:33:55.833296 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/llama2-13b-math1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can, for instance, do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T00:33:55.833296 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/llama2-13b-math1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T00:33:55.833296(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a619f2d4fb115fcf46193e2b2285ce086552d718
# Dataset Card for Evaluation run of nkpz/llama2-22b-daydreamer-v3

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/nkpz/llama2-22b-daydreamer-v3
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [nkpz/llama2-22b-daydreamer-v3](https://huggingface.co/nkpz/llama2-22b-daydreamer-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T08:56:42.787237](https://huggingface.co/datasets/open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3/blob/main/results_2023-09-23T08-56-42.787237.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.006606543624161074,
        "em_stderr": 0.0008296357389921868,
        "f1": 0.08847525167785215,
        "f1_stderr": 0.0017746482079898484,
        "acc": 0.38635706776019,
        "acc_stderr": 0.008833441686995644
    },
    "harness|drop|3": {
        "em": 0.006606543624161074,
        "em_stderr": 0.0008296357389921868,
        "f1": 0.08847525167785215,
        "f1_stderr": 0.0017746482079898484
    },
    "harness|gsm8k|5": {
        "acc": 0.03790750568612585,
        "acc_stderr": 0.0052603339077984266
    },
    "harness|winogrande|5": {
        "acc": 0.7348066298342542,
        "acc_stderr": 0.012406549466192861
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
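Similarly, a minimal sketch for loading per-sample details beyond the default `harness_winogrande_5` example (assuming the `datasets` library is installed; the config and split names below are taken from this dataset's metadata rather than from the card):

```python
from datasets import load_dataset

# Per-sample DROP details; the "latest" split always points at the most
# recent run of this model.
drop_details = load_dataset(
    "open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3",
    "harness_drop_3",
    split="latest",
)

# The GSM8K details, pinned to one run via its timestamped split name.
gsm8k_run = load_dataset(
    "open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3",
    "harness_gsm8k_5",
    split="2023_09_23T08_56_42.787237",
)
```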
open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3
[ "region:us" ]
2023-08-18T17:56:05+00:00
{"pretty_name": "Evaluation run of nkpz/llama2-22b-daydreamer-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [nkpz/llama2-22b-daydreamer-v3](https://huggingface.co/nkpz/llama2-22b-daydreamer-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T08:56:42.787237](https://huggingface.co/datasets/open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3/blob/main/results_2023-09-23T08-56-42.787237.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006606543624161074,\n \"em_stderr\": 0.0008296357389921868,\n \"f1\": 0.08847525167785215,\n \"f1_stderr\": 0.0017746482079898484,\n \"acc\": 0.38635706776019,\n \"acc_stderr\": 0.008833441686995644\n },\n \"harness|drop|3\": {\n \"em\": 0.006606543624161074,\n \"em_stderr\": 0.0008296357389921868,\n \"f1\": 0.08847525167785215,\n \"f1_stderr\": 0.0017746482079898484\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03790750568612585,\n \"acc_stderr\": 0.0052603339077984266\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.012406549466192861\n }\n}\n```", "repo_url": "https://huggingface.co/nkpz/llama2-22b-daydreamer-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|arc:challenge|25_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T08_56_42.787237", "path": ["**/details_harness|drop|3_2023-09-23T08-56-42.787237.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T08-56-42.787237.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T08_56_42.787237", "path": ["**/details_harness|gsm8k|5_2023-09-23T08-56-42.787237.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T08-56-42.787237.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hellaswag|10_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:34:13.922429.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:34:13.922429.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T14:34:13.922429.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T14:34:13.922429.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T14:34:13.922429.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T08_56_42.787237", "path": ["**/details_harness|winogrande|5_2023-09-23T08-56-42.787237.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T08-56-42.787237.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T14_34_13.922429", "path": ["results_2023-08-17T14:34:13.922429.parquet"]}, {"split": "2023_09_23T08_56_42.787237", "path": ["results_2023-09-23T08-56-42.787237.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T08-56-42.787237.parquet"]}]}]}
2023-09-23T07:56:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of nkpz/llama2-22b-daydreamer-v3

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/nkpz/llama2-22b-daydreamer-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [nkpz/llama2-22b-daydreamer-v3](https://huggingface.co/nkpz/llama2-22b-daydreamer-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance proceed as in the sketch following this card.

## Latest results

These are the [latest results from run 2023-09-23T08:56:42.787237](https://huggingface.co/datasets/open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3/blob/main/results_2023-09-23T08-56-42.787237.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
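For reference, a minimal loading sketch (assuming the `datasets` library is installed and the Hugging Face Hub is reachable; the repo, config, and split names below are taken verbatim from this dataset's metadata above):

```python
from datasets import load_dataset

# Per-sample details for one task config; "train" tracks the latest run
# (this mirrors the call recorded in the dataset_summary metadata above).
data = load_dataset(
    "open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3",
    "harness_winogrande_5",
    split="train",
)

# Aggregated metrics live in the "results" config; "latest" always points
# at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3",
    "results",
    split="latest",
)
print(results[0])
```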
[ "# Dataset Card for Evaluation run of nkpz/llama2-22b-daydreamer-v3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model nkpz/llama2-22b-daydreamer-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T08:56:42.787237(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of nkpz/llama2-22b-daydreamer-v3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model nkpz/llama2-22b-daydreamer-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T08:56:42.787237(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nkpz/llama2-22b-daydreamer-v3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model nkpz/llama2-22b-daydreamer-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T08:56:42.787237(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
35959ef2290eadcc4110cc93990998f4ccbd95b1
# Dataset Card for Evaluation run of Voicelab/trurl-2-13b

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Voicelab/trurl-2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-13b](https://huggingface.co/Voicelab/trurl-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run (see the sketch following this card for loading one such split). The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T14:01:30.256231](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b/blob/main/results_2023-10-13T14-01-30.256231.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.3288590604026846,
        "em_stderr": 0.0048111779783056785,
        "f1": 0.43254823825503735,
        "f1_stderr": 0.0046112274283012355,
        "acc": 0.437781127387769,
        "acc_stderr": 0.010708773499687067
    },
    "harness|drop|3": {
        "em": 0.3288590604026846,
        "em_stderr": 0.0048111779783056785,
        "f1": 0.43254823825503735,
        "f1_stderr": 0.0046112274283012355
    },
    "harness|gsm8k|5": {
        "acc": 0.1281273692191054,
        "acc_stderr": 0.009206398549980031
    },
    "harness|winogrande|5": {
        "acc": 0.7474348855564326,
        "acc_stderr": 0.012211148449394105
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
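Beyond the `train`/`latest` aliases shown above, a specific run can be loaded by its timestamped split. A sketch (assuming the `datasets` library; hyphens and colons in the run timestamp become underscores in the split name, as in the configs metadata below):

```python
from datasets import load_dataset

# Load the DROP details from the 2023-10-13T14:01:30.256231 run specifically;
# the split name is taken verbatim from this dataset's configs metadata.
drop_details = load_dataset(
    "open-llm-leaderboard/details_Voicelab__trurl-2-13b",
    "harness_drop_3",
    split="2023_10_13T14_01_30.256231",
)
```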
open-llm-leaderboard/details_Voicelab__trurl-2-13b
[ "region:us" ]
2023-08-18T17:56:24+00:00
{"pretty_name": "Evaluation run of Voicelab/trurl-2-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-13b](https://huggingface.co/Voicelab/trurl-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Voicelab__trurl-2-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T14:01:30.256231](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b/blob/main/results_2023-10-13T14-01-30.256231.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3288590604026846,\n \"em_stderr\": 0.0048111779783056785,\n \"f1\": 0.43254823825503735,\n \"f1_stderr\": 0.0046112274283012355,\n \"acc\": 0.437781127387769,\n \"acc_stderr\": 0.010708773499687067\n },\n \"harness|drop|3\": {\n \"em\": 0.3288590604026846,\n \"em_stderr\": 0.0048111779783056785,\n \"f1\": 0.43254823825503735,\n \"f1_stderr\": 0.0046112274283012355\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1281273692191054,\n \"acc_stderr\": 0.009206398549980031\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n }\n}\n```", "repo_url": "https://huggingface.co/Voicelab/trurl-2-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|arc:challenge|25_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T14_01_30.256231", "path": ["**/details_harness|drop|3_2023-10-13T14-01-30.256231.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T14-01-30.256231.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T14_01_30.256231", "path": ["**/details_harness|gsm8k|5_2023-10-13T14-01-30.256231.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T14-01-30.256231.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hellaswag|10_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:17:14.973994.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:17:14.973994.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T15:17:14.973994.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:17:14.973994.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T15:17:14.973994.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T15:17:14.973994.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T14_01_30.256231", "path": ["**/details_harness|winogrande|5_2023-10-13T14-01-30.256231.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T14-01-30.256231.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:57:35.828044.parquet", 
"**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:57:35.828044.parquet", 
"**/details_original|mmlu:world_religions|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:57:35.828044.parquet", 
"**/details_original|mmlu:medical_genetics|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:57:35.828044.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": 
"original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", 
"data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": 
["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_statistics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": 
["**/details_original|mmlu:marketing|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T20_57_35.828044", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:57:35.828044.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:57:35.828044.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T15_17_14.973994", "path": ["results_2023-08-17T15:17:14.973994.parquet"]}, {"split": "2023_08_28T20_57_35.828044", "path": ["results_2023-08-28T20:57:35.828044.parquet"]}, {"split": "2023_10_13T14_01_30.256231", "path": ["results_2023-10-13T14-01-30.256231.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T14-01-30.256231.parquet"]}]}]}
# Dataset Card for Evaluation run of Voicelab/trurl-2-13b

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model Voicelab/trurl-2-13b on the Open LLM Leaderboard.

The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch after this card):

## Latest results

These are the latest results from run 2023-10-13T14:01:30.256231 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
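A minimal loading sketch for this card, following the pattern the other cards in this collection use. The dataset id `open-llm-leaderboard/details_Voicelab__trurl-2-13b` is an assumption based on that naming convention (it is not spelled out in this card's text), while `harness_winogrande_5` is one of the config names listed in the configs metadata above.

```python
from datasets import load_dataset

# Dataset id assumed from the collection's naming convention, not stated in the card.
data = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-13b",
                    "harness_winogrande_5",  # one config per evaluated task
                    split="train")           # "train" always points to the latest results
```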
[ "# Dataset Card for Evaluation run of Voicelab/trurl-2-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Voicelab/trurl-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T14:01:30.256231(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Voicelab/trurl-2-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Voicelab/trurl-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T14:01:30.256231(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 165, 68, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Voicelab/trurl-2-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Voicelab/trurl-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T14:01:30.256231(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
# Dataset Card for Evaluation run of Voicelab/trurl-2-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Voicelab/trurl-2-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-7b](https://huggingface.co/Voicelab/trurl-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-24T13:00:35.734451](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-7b/blob/main/results_2023-10-24T13-00-35.734451.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.26908557046979864,
        "em_stderr": 0.004541696656496853,
        "f1": 0.3290079697986583,
        "f1_stderr": 0.004499453214736992,
        "acc": 0.3967222424009962,
        "acc_stderr": 0.009837690155913053
    },
    "harness|drop|3": {
        "em": 0.26908557046979864,
        "em_stderr": 0.004541696656496853,
        "f1": 0.3290079697986583,
        "f1_stderr": 0.004499453214736992
    },
    "harness|gsm8k|5": {
        "acc": 0.0712661106899166,
        "acc_stderr": 0.007086462127954499
    },
    "harness|winogrande|5": {
        "acc": 0.7221783741120757,
        "acc_stderr": 0.012588918183871605
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
{"pretty_name": "Evaluation run of Voicelab/trurl-2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-7b](https://huggingface.co/Voicelab/trurl-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Voicelab__trurl-2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T13:00:35.734451](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-7b/blob/main/results_2023-10-24T13-00-35.734451.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.26908557046979864,\n \"em_stderr\": 0.004541696656496853,\n \"f1\": 0.3290079697986583,\n \"f1_stderr\": 0.004499453214736992,\n \"acc\": 0.3967222424009962,\n \"acc_stderr\": 0.009837690155913053\n },\n \"harness|drop|3\": {\n \"em\": 0.26908557046979864,\n \"em_stderr\": 0.004541696656496853,\n \"f1\": 0.3290079697986583,\n \"f1_stderr\": 0.004499453214736992\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \"acc_stderr\": 0.007086462127954499\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871605\n }\n}\n```", "repo_url": "https://huggingface.co/Voicelab/trurl-2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|arc:challenge|25_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T13_00_35.734451", "path": ["**/details_harness|drop|3_2023-10-24T13-00-35.734451.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T13-00-35.734451.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T13_00_35.734451", "path": ["**/details_harness|gsm8k|5_2023-10-24T13-00-35.734451.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T13-00-35.734451.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hellaswag|10_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:14:32.422343.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:14:32.422343.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T14:14:32.422343.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:14:32.422343.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T14:14:32.422343.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T14:14:32.422343.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T13_00_35.734451", "path": ["**/details_harness|winogrande|5_2023-10-24T13-00-35.734451.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T13-00-35.734451.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T14_14_32.422343", "path": ["results_2023-08-17T14:14:32.422343.parquet"]}, {"split": "2023_10_24T13_00_35.734451", "path": ["results_2023-10-24T13-00-35.734451.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T13-00-35.734451.parquet"]}]}]}
2023-10-24T12:00:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Voicelab/trurl-2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Voicelab/trurl-2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the snippet below): ## Latest results These are the latest results from run 2023-10-24T13:00:35.734451 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
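A minimal loading sketch for the step referenced above, mirroring the `load_dataset` call recorded in this card's metadata:

```python
from datasets import load_dataset

# Per-example details for one evaluated task (WinoGrande, 5-shot);
# the "train" split always points to the latest run's results.
data = load_dataset(
    "open-llm-leaderboard/details_Voicelab__trurl-2-7b",
    "harness_winogrande_5",
    split="train",
)
```

Any other configuration name from the metadata above (for example `harness_gsm8k_5` or `results`) can be substituted to pull the details for a different task.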
[ "# Dataset Card for Evaluation run of Voicelab/trurl-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Voicelab/trurl-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-24T13:00:35.734451(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Voicelab/trurl-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Voicelab/trurl-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-24T13:00:35.734451(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Voicelab/trurl-2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Voicelab/trurl-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T13:00:35.734451(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
4cbd987a646b5ef121f1e32db552fe69d7ce5cd8
# Dataset Card for Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-research-oasst1-llama-65b](https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-17T22:10:29.981773](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b/blob/main/results_2023-08-17T22%3A10%3A29.981773.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6359037673839993, "acc_stderr": 0.0329346816196445, "acc_norm": 0.6396809356138717, "acc_norm_stderr": 0.03290965482744071, "mc1": 0.34394124847001223, "mc1_stderr": 0.01662908751427678, "mc2": 0.48845185520886875, "mc2_stderr": 0.014057830912491135 }, "harness|arc:challenge|25": { "acc": 0.6177474402730375, "acc_stderr": 0.014200454049979275, "acc_norm": 0.6476109215017065, "acc_norm_stderr": 0.01396014260059868 }, "harness|hellaswag|10": { "acc": 0.6664011153156741, "acc_stderr": 0.004705347137699622, "acc_norm": 0.8593905596494722, "acc_norm_stderr": 0.0034690778470563765 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.042849586397534015, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.042849586397534015 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.75, "acc_stderr": 0.03523807393012047, "acc_norm": 0.75, "acc_norm_stderr": 0.03523807393012047 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6339622641509434, "acc_stderr": 0.029647813539365245, "acc_norm": 0.6339622641509434, "acc_norm_stderr": 0.029647813539365245 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.03800968060554858, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.03800968060554858 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5549132947976878, "acc_stderr": 0.03789401760283648, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.03789401760283648 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201942, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201942 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101737, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101737 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.04579639422070434, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3783068783068783, "acc_stderr": 0.024976954053155254, "acc_norm": 0.3783068783068783, "acc_norm_stderr": 0.024976954053155254 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 
0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7483870967741936, "acc_stderr": 0.024685979286239963, "acc_norm": 0.7483870967741936, "acc_norm_stderr": 0.024685979286239963 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4039408866995074, "acc_stderr": 0.0345245390382204, "acc_norm": 0.4039408866995074, "acc_norm_stderr": 0.0345245390382204 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.047258156262526066, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526066 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721164, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721164 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463355, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463355 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8756476683937824, "acc_stderr": 0.023814477086593542, "acc_norm": 0.8756476683937824, "acc_norm_stderr": 0.023814477086593542 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6256410256410256, "acc_stderr": 0.024537591572830513, "acc_norm": 0.6256410256410256, "acc_norm_stderr": 0.024537591572830513 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.02840653309060846, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.02840653309060846 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.030489911417673227, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.030489911417673227 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3973509933774834, "acc_stderr": 0.0399552400768168, "acc_norm": 0.3973509933774834, "acc_norm_stderr": 0.0399552400768168 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8275229357798165, "acc_stderr": 0.016197807956848043, "acc_norm": 0.8275229357798165, "acc_norm_stderr": 0.016197807956848043 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5740740740740741, "acc_stderr": 0.03372343271653062, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.03372343271653062 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931055, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931055 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8396624472573839, "acc_stderr": 0.02388438092596567, "acc_norm": 0.8396624472573839, "acc_norm_stderr": 0.02388438092596567 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229146, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229146 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.03844876139785271, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, 
"harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742179, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742179 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573974, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573974 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8148148148148148, "acc_stderr": 0.013890862162876166, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.013890862162876166 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.02418242749657761, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.02418242749657761 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4770949720670391, "acc_stderr": 0.016704945740326188, "acc_norm": 0.4770949720670391, "acc_norm_stderr": 0.016704945740326188 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6764705882352942, "acc_stderr": 0.026787453111906497, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.026787453111906497 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7459807073954984, "acc_stderr": 0.0247238615047717, "acc_norm": 0.7459807073954984, "acc_norm_stderr": 0.0247238615047717 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.02399350170904212, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.02399350170904212 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4830508474576271, "acc_stderr": 0.01276289688921086, "acc_norm": 0.4830508474576271, "acc_norm_stderr": 0.01276289688921086 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6066176470588235, "acc_stderr": 0.029674288281311155, "acc_norm": 0.6066176470588235, "acc_norm_stderr": 0.029674288281311155 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507205, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507205 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04265792110940589, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6653061224489796, "acc_stderr": 0.030209235226242304, "acc_norm": 0.6653061224489796, "acc_norm_stderr": 0.030209235226242304 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.03379976689896308, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896308 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 
0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.34394124847001223, "mc1_stderr": 0.01662908751427678, "mc2": 0.48845185520886875, "mc2_stderr": 0.014057830912491135 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
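For the aggregated numbers quoted in the "Latest results" block above, a minimal sketch; it assumes this repository exposes the same `results` configuration with a `latest` split that the sibling evaluation-details datasets in this dump declare:

```python
from datasets import load_dataset

# Assumption: a "results" configuration with a "latest" split, following the
# pattern declared by the other evaluation-details datasets; each row carries
# the aggregated acc / acc_norm / mc1 / mc2 values shown above.
results = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b",
    "results",
    split="latest",
)
print(results[0])
```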
open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b
[ "region:us" ]
2023-08-18T17:56:43+00:00
{"pretty_name": "Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b", "dataset_summary": "Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-research-oasst1-llama-65b](https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-17T22:10:29.981773](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b/blob/main/results_2023-08-17T22%3A10%3A29.981773.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6359037673839993,\n \"acc_stderr\": 0.0329346816196445,\n \"acc_norm\": 0.6396809356138717,\n \"acc_norm_stderr\": 0.03290965482744071,\n \"mc1\": 0.34394124847001223,\n \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.48845185520886875,\n \"mc2_stderr\": 0.014057830912491135\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979275,\n \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.01396014260059868\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6664011153156741,\n \"acc_stderr\": 0.004705347137699622,\n \"acc_norm\": 0.8593905596494722,\n \"acc_norm_stderr\": 0.0034690778470563765\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101737,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463355,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463355\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6256410256410256,\n \"acc_stderr\": 
0.024537591572830513,\n \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830513\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848043,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848043\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931055,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931055\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.013890862162876166,\n \"acc_norm\": 0.8148148148148148,\n 
\"acc_norm_stderr\": 0.013890862162876166\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4770949720670391,\n \"acc_stderr\": 0.016704945740326188,\n \"acc_norm\": 0.4770949720670391,\n \"acc_norm_stderr\": 0.016704945740326188\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906497,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906497\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n \"acc_stderr\": 0.0247238615047717,\n \"acc_norm\": 0.7459807073954984,\n \"acc_norm_stderr\": 0.0247238615047717\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904212,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904212\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4830508474576271,\n \"acc_stderr\": 0.01276289688921086,\n \"acc_norm\": 0.4830508474576271,\n \"acc_norm_stderr\": 0.01276289688921086\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242304,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242304\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.48845185520886875,\n \"mc2_stderr\": 0.014057830912491135\n }\n}\n```", "repo_url": "https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": 
"2023_08_17T17_53_50.635044", "path": ["**/details_harness|arc:challenge|25_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|arc:challenge|25_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hellaswag|10_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hellaswag|10_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:53:50.635044.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T17:53:50.635044.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": 
"2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": 
[{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", 
"data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T22:10:29.981773.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T17_53_50.635044", "path": ["results_2023-08-17T17:53:50.635044.parquet"]}, {"split": "2023_08_17T22_10_29.981773", "path": ["results_2023-08-17T22:10:29.981773.parquet"]}, {"split": "latest", "path": ["results_2023-08-17T22:10:29.981773.parquet"]}]}]}
2023-08-27T11:41:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model h2oai/h2ogpt-research-oasst1-llama-65b on the Open LLM Leaderboard.

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch at the end of this card):

## Latest results

These are the latest results from run 2023-08-17T22:10:29.981773 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
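A minimal sketch of the loading call referenced above, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming convention (the repo id itself is an assumption) and using a config name and the `latest` split taken from the configs metadata earlier in this record:

```python
from datasets import load_dataset

# Config names such as "harness_truthfulqa_mc_0" and the "latest" split
# come from the configs metadata listed above; "latest" always points at
# the most recent run (2023-08-17T22:10:29.981773 here).
data = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b",  # assumed repo id
    "harness_truthfulqa_mc_0",
    split="latest",
)
print(data)
```

The aggregated numbers live in the `results` config and can be loaded the same way, e.g. with `"results"` as the config name and `split="latest"`.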
[ "# Dataset Card for Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model h2oai/h2ogpt-research-oasst1-llama-65b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T22:10:29.981773 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model h2oai/h2ogpt-research-oasst1-llama-65b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T22:10:29.981773 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 31, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model h2oai/h2ogpt-research-oasst1-llama-65b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-17T22:10:29.981773 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
db6339ef2dd5473fd0eb304ad76bdcfce39c90ff
# Dataset of iizunamaru_megumu/飯綱丸龍/이이즈나마루메구무 (Touhou)

This is the dataset of iizunamaru_megumu/飯綱丸龍/이이즈나마루메구무 (Touhou), containing 45 images and their tags.

The core tags of this character are `hat, long_hair, tokin_hat, red_eyes, blue_hair, blue_headwear, breasts, pointy_ears, hair_between_eyes, bangs`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 70.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 45 | 38.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 117 | 80.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 45 | 61.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 117 | 116.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

A loading sketch for the IMG+TXT packages is given after the cluster tables below.

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/iizunamaru_megumu_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some recurring outfits may be mined from them.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blue_dress, frilled_dress, pom_pom_(clothes), ribbon_trim, sleeveless_coat, solo, kneehighs, tengu-geta, black_socks, smile, gem, purple_footwear, open_mouth, looking_at_viewer, black_coat | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blue_dress, frilled_dress, pom_pom_(clothes), ribbon_trim, sleeveless_coat, solo, gem, kneehighs, looking_at_viewer, starry_sky, black_socks, closed_mouth, night_sky, smile, cloud, pauldrons, tengu-geta | | 2 | 12 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blue_dress, frilled_dress, ribbon_trim, solo, gem, sleeveless_coat, pom_pom_(clothes), large_breasts, simple_background, looking_at_viewer, smile, white_background, wings, black_coat, closed_mouth, cowboy_shot, earrings | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_dress | frilled_dress | pom_pom_(clothes) | ribbon_trim | sleeveless_coat | solo | kneehighs | tengu-geta | black_socks | smile | gem | purple_footwear | open_mouth | looking_at_viewer | black_coat | starry_sky | closed_mouth | night_sky | cloud | pauldrons | large_breasts | simple_background | white_background | wings | cowboy_shot | earrings | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:----------------|:--------------------|:--------------|:------------------|:-------|:------------|:-------------|:--------------|:--------|:------|:------------------|:-------------|:--------------------|:-------------|:-------------|:---------------|:------------|:--------|:------------|:----------------|:--------------------|:-------------------|:--------|:--------------|:-----------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | | X | | X | X | X | X | X | | | | | | | | 2 | 12 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | | | | X | X | | | X | X | | X | | | | X | X | X | X | X | X |
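The non-raw packages in the List of Packages table are plain IMG+TXT archives, so they can also be used without waifuc. Below is a minimal sketch of downloading and iterating the 800px variant; note that pairing each image with a same-stem `.txt` caption file is an assumption about the IMG+TXT layout, not something documented above.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive (see the List of Packages table)
zip_file = hf_hub_download(
    repo_id='CyberHarem/iizunamaru_megumu_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it into a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# walk the extracted files and pair images with caption files;
# the same-stem .txt convention is an assumption about the layout
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```

The same pattern should work for the `1200` and `stage3-p480-*` archives by swapping the `filename` argument.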
CyberHarem/iizunamaru_megumu_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T17:56:51+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T02:22:04+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of iizunamaru\_megumu/飯綱丸龍/이이즈나마루메구무 (Touhou) ===================================================== This is the dataset of iizunamaru\_megumu/飯綱丸龍/이이즈나마루메구무 (Touhou), containing 45 images and their tags. The core tags of this character are 'hat, long\_hair, tokin\_hat, red\_eyes, blue\_hair, blue\_headwear, breasts, pointy\_ears, hair\_between\_eyes, bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
925fd672cf96b2b1334d217921ddd7e01a99c23e
# Dataset Card for Evaluation run of CoolWP/llama-2-13b-guanaco-fp16

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [CoolWP/llama-2-13b-guanaco-fp16](https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-08-17T18:49:30.894423](https://huggingface.co/datasets/open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16/blob/main/results_2023-08-17T18%3A49%3A30.894423.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.5557402565625233,
        "acc_stderr": 0.03433097920024075,
        "acc_norm": 0.5600027152011281,
        "acc_norm_stderr": 0.03430992590405376,
        "mc1": 0.29865361077111385,
        "mc1_stderr": 0.016021570613768542,
        "mc2": 0.43400538092704843,
        "mc2_stderr": 0.014284105671223521
    },
    "harness|arc:challenge|25": {
        "acc": 0.552901023890785,
        "acc_stderr": 0.014529380160526843,
        "acc_norm": 0.5955631399317406,
        "acc_norm_stderr": 0.014342036483436177
    },
    "harness|hellaswag|10": {
        "acc": 0.615116510655248,
        "acc_stderr": 0.004855733568540267,
        "acc_norm": 0.8239394542919737,
        "acc_norm_stderr": 0.003800932770597752
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.34,
        "acc_stderr": 0.047609522856952365,
        "acc_norm": 0.34,
        "acc_norm_stderr": 0.047609522856952365
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.4740740740740741,
        "acc_stderr": 0.04313531696750574,
        "acc_norm": 0.4740740740740741,
        "acc_norm_stderr": 0.04313531696750574
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.5263157894736842,
        "acc_stderr": 0.04063302731486671,
        "acc_norm": 0.5263157894736842,
        "acc_norm_stderr": 0.04063302731486671
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.56,
        "acc_stderr": 0.04988876515698589,
        "acc_norm": 0.56,
        "acc_norm_stderr": 0.04988876515698589
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.6037735849056604,
        "acc_stderr": 0.030102793781791197,
        "acc_norm": 0.6037735849056604,
        "acc_norm_stderr": 0.030102793781791197
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.5763888888888888,
        "acc_stderr": 0.0413212501972337,
        "acc_norm": 0.5763888888888888,
        "acc_norm_stderr": 0.0413212501972337
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.41,
        "acc_stderr": 0.04943110704237102,
        "acc_norm": 
0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5144508670520231, "acc_stderr": 0.03810871630454764, "acc_norm": 0.5144508670520231, "acc_norm_stderr": 0.03810871630454764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171452, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171452 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.42127659574468085, "acc_stderr": 0.03227834510146268, "acc_norm": 0.42127659574468085, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2894736842105263, "acc_stderr": 0.04266339443159394, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.04266339443159394 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.0242785680243077, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.0242785680243077 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.04190596438871137, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.04190596438871137 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6645161290322581, "acc_stderr": 0.02686020644472435, "acc_norm": 0.6645161290322581, "acc_norm_stderr": 0.02686020644472435 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.458128078817734, "acc_stderr": 0.03505630140785741, "acc_norm": 0.458128078817734, "acc_norm_stderr": 0.03505630140785741 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03681050869161551, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03681050869161551 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6919191919191919, "acc_stderr": 0.032894773300986155, "acc_norm": 0.6919191919191919, "acc_norm_stderr": 0.032894773300986155 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8031088082901554, "acc_stderr": 0.028697873971860677, "acc_norm": 0.8031088082901554, "acc_norm_stderr": 0.028697873971860677 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5102564102564102, "acc_stderr": 0.025345672221942374, "acc_norm": 0.5102564102564102, "acc_norm_stderr": 0.025345672221942374 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.29259259259259257, "acc_stderr": 0.02773896963217609, "acc_norm": 0.29259259259259257, "acc_norm_stderr": 0.02773896963217609 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5714285714285714, 
"acc_stderr": 0.032145368597886394, "acc_norm": 0.5714285714285714, "acc_norm_stderr": 0.032145368597886394 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.037804458505267334, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.037804458505267334 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7486238532110092, "acc_stderr": 0.018599206360287415, "acc_norm": 0.7486238532110092, "acc_norm_stderr": 0.018599206360287415 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4444444444444444, "acc_stderr": 0.03388857118502326, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.03388857118502326 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591362, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591362 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.729957805907173, "acc_stderr": 0.028900721906293426, "acc_norm": 0.729957805907173, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835794, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6106870229007634, "acc_stderr": 0.04276486542814591, "acc_norm": 0.6106870229007634, "acc_norm_stderr": 0.04276486542814591 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908706, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908706 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6932515337423313, "acc_stderr": 0.03623089915724146, "acc_norm": 0.6932515337423313, "acc_norm_stderr": 0.03623089915724146 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.26785714285714285, "acc_stderr": 0.04203277291467762, "acc_norm": 0.26785714285714285, "acc_norm_stderr": 0.04203277291467762 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7991452991452992, "acc_stderr": 0.026246772946890474, "acc_norm": 0.7991452991452992, "acc_norm_stderr": 0.026246772946890474 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7586206896551724, "acc_stderr": 0.015302380123542108, "acc_norm": 0.7586206896551724, "acc_norm_stderr": 0.015302380123542108 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6329479768786127, "acc_stderr": 0.02595005433765408, "acc_norm": 0.6329479768786127, "acc_norm_stderr": 0.02595005433765408 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3553072625698324, "acc_stderr": 0.01600698993480319, "acc_norm": 0.3553072625698324, "acc_norm_stderr": 0.01600698993480319 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6111111111111112, "acc_stderr": 0.027914055510468008, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.027914055510468008 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6205787781350482, "acc_stderr": 0.027559949802347813, "acc_norm": 0.6205787781350482, "acc_norm_stderr": 0.027559949802347813 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.6358024691358025, "acc_stderr": 0.026774929899722334, "acc_norm": 0.6358024691358025, "acc_norm_stderr": 0.026774929899722334 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3971631205673759, "acc_stderr": 0.0291898056735871, "acc_norm": 0.3971631205673759, "acc_norm_stderr": 0.0291898056735871 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.41851368970013036, "acc_stderr": 0.012599505608336461, "acc_norm": 0.41851368970013036, "acc_norm_stderr": 0.012599505608336461 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5294117647058824, "acc_stderr": 0.03032024326500413, "acc_norm": 0.5294117647058824, "acc_norm_stderr": 0.03032024326500413 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5408496732026143, "acc_stderr": 0.020160213617222516, "acc_norm": 0.5408496732026143, "acc_norm_stderr": 0.020160213617222516 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6081632653061224, "acc_stderr": 0.031251275910891656, "acc_norm": 0.6081632653061224, "acc_norm_stderr": 0.031251275910891656 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7412935323383084, "acc_stderr": 0.030965903123573026, "acc_norm": 0.7412935323383084, "acc_norm_stderr": 0.030965903123573026 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.039427724440366255, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366255 }, "harness|hendrycksTest-virology|5": { "acc": 0.43373493975903615, "acc_stderr": 0.03858158940685517, "acc_norm": 0.43373493975903615, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.03274485211946956, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.03274485211946956 }, "harness|truthfulqa:mc|0": { "mc1": 0.29865361077111385, "mc1_stderr": 0.016021570613768542, "mc2": 0.43400538092704843, "mc2_stderr": 0.014284105671223521 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16
[ "region:us" ]
2023-08-18T17:56:54+00:00
{"pretty_name": "Evaluation run of CoolWP/llama-2-13b-guanaco-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [CoolWP/llama-2-13b-guanaco-fp16](https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-17T18:49:30.894423](https://huggingface.co/datasets/open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16/blob/main/results_2023-08-17T18%3A49%3A30.894423.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5557402565625233,\n \"acc_stderr\": 0.03433097920024075,\n \"acc_norm\": 0.5600027152011281,\n \"acc_norm_stderr\": 0.03430992590405376,\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.43400538092704843,\n \"mc2_stderr\": 0.014284105671223521\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526843,\n \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.615116510655248,\n \"acc_stderr\": 0.004855733568540267,\n \"acc_norm\": 0.8239394542919737,\n \"acc_norm_stderr\": 0.003800932770597752\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 
0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.04190596438871137,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.04190596438871137\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.026246772946890474,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.026246772946890474\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7586206896551724,\n \"acc_stderr\": 
0.015302380123542108,\n \"acc_norm\": 0.7586206896551724,\n \"acc_norm_stderr\": 0.015302380123542108\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.02595005433765408,\n \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.02595005433765408\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n \"acc_stderr\": 0.01600698993480319,\n \"acc_norm\": 0.3553072625698324,\n \"acc_norm_stderr\": 0.01600698993480319\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n \"acc_stderr\": 0.027559949802347813,\n \"acc_norm\": 0.6205787781350482,\n \"acc_norm_stderr\": 0.027559949802347813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722334,\n \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722334\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n \"acc_stderr\": 0.012599505608336461,\n \"acc_norm\": 0.41851368970013036,\n \"acc_norm_stderr\": 0.012599505608336461\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5408496732026143,\n \"acc_stderr\": 0.020160213617222516,\n \"acc_norm\": 0.5408496732026143,\n \"acc_norm_stderr\": 0.020160213617222516\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.7412935323383084,\n \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.43400538092704843,\n \"mc2_stderr\": 0.014284105671223521\n }\n}\n```", "repo_url": "https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": 
"harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|arc:challenge|25_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hellaswag|10_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T18:49:30.894423.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T18_49_30.894423", "path": ["results_2023-08-17T18:49:30.894423.parquet"]}, {"split": "latest", "path": ["results_2023-08-17T18:49:30.894423.parquet"]}]}]}
2023-08-27T11:41:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CoolWP/llama-2-13b-guanaco-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model CoolWP/llama-2-13b-guanaco-fp16 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-17T18:49:30.894423 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
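The processed card above ends its loading instructions without the code fence that the unprocessed cards in this dump carry. Judging from that shared template, the elided snippet presumably resembled the sketch below; the repository id follows the leaderboard's `details_{org}__{model}` naming pattern and is an assumption, as is the choice of `harness_truthfulqa_mc_0` (a config that does appear verbatim in the metadata above):

```python
from datasets import load_dataset

# Assumed repo id, reconstructed from the details_{org}__{model} convention;
# the processed card masks the concrete URL.
data = load_dataset(
    "open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16",
    "harness_truthfulqa_mc_0",  # one of the configs listed in the metadata above
    split="train",
)
```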
[ "# Dataset Card for Evaluation run of CoolWP/llama-2-13b-guanaco-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CoolWP/llama-2-13b-guanaco-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T18:49:30.894423 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CoolWP/llama-2-13b-guanaco-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CoolWP/llama-2-13b-guanaco-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T18:49:30.894423 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CoolWP/llama-2-13b-guanaco-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CoolWP/llama-2-13b-guanaco-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-17T18:49:30.894423 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3b1d9c1fcb076355a1fb249d3f1a13a5ff194b07
# Dataset Card for Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1](https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T02:48:34.876063](https://huggingface.co/datasets/open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1/blob/main/results_2023-09-17T02-48-34.876063.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.018141778523489933,
        "em_stderr": 0.0013667968592600823,
        "f1": 0.0824182046979865,
        "f1_stderr": 0.0019512337351707363,
        "acc": 0.30108941444123377,
        "acc_stderr": 0.0072592536452981875
    },
    "harness|drop|3": {
        "em": 0.018141778523489933,
        "em_stderr": 0.0013667968592600823,
        "f1": 0.0824182046979865,
        "f1_stderr": 0.0019512337351707363
    },
    "harness|gsm8k|5": {
        "acc": 0.000758150113722517,
        "acc_stderr": 0.000758150113722541
    },
    "harness|winogrande|5": {
        "acc": 0.601420678768745,
        "acc_stderr": 0.013760357176873834
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
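The card above loads a single task configuration; its summary also describes an aggregated "results" configuration. A small companion sketch follows — the config name and the `latest` split are both listed verbatim in this card's metadata below, so only the usage pattern itself is illustrative:

```python
from datasets import load_dataset

# Aggregated metrics for the newest run; "latest" mirrors the most recent
# timestamped split, per the configs declared in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1",
    "results",
    split="latest",
)
print(results[0])
```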
open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
[ "region:us" ]
2023-08-18T17:57:04+00:00
{"pretty_name": "Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1", "dataset_summary": "Dataset automatically created during the evaluation run of model [DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1](https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T02:48:34.876063](https://huggingface.co/datasets/open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1/blob/main/results_2023-09-17T02-48-34.876063.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.018141778523489933,\n \"em_stderr\": 0.0013667968592600823,\n \"f1\": 0.0824182046979865,\n \"f1_stderr\": 0.0019512337351707363,\n \"acc\": 0.30108941444123377,\n \"acc_stderr\": 0.0072592536452981875\n },\n \"harness|drop|3\": {\n \"em\": 0.018141778523489933,\n \"em_stderr\": 0.0013667968592600823,\n \"f1\": 0.0824182046979865,\n \"f1_stderr\": 0.0019512337351707363\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.000758150113722541\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.601420678768745,\n \"acc_stderr\": 0.013760357176873834\n }\n}\n```", "repo_url": "https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|arc:challenge|25_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T02_48_34.876063", "path": ["**/details_harness|drop|3_2023-09-17T02-48-34.876063.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T02-48-34.876063.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T02_48_34.876063", "path": ["**/details_harness|gsm8k|5_2023-09-17T02-48-34.876063.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T02-48-34.876063.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": 
["**/details_harness|hellaswag|10_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T19:06:24.257655.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T19:06:24.257655.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T02_48_34.876063", "path": ["**/details_harness|winogrande|5_2023-09-17T02-48-34.876063.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T02-48-34.876063.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T19_06_24.257655", "path": ["results_2023-08-17T19:06:24.257655.parquet"]}, {"split": "2023_09_17T02_48_34.876063", "path": ["results_2023-09-17T02-48-34.876063.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T02-48-34.876063.parquet"]}]}]}
2023-09-17T01:48:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T02:48:34.876063 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T02:48:34.876063(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T02:48:34.876063(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 35, 31, 183, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T02:48:34.876063(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
083867164ec31cf489427f918eb425d3ca5f1b51
# Dataset Card for Evaluation run of davzoku/cria-llama2-7b-v1.3_peft

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/davzoku/cria-llama2-7b-v1.3_peft
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [davzoku/cria-llama2-7b-v1.3_peft](https://huggingface.co/davzoku/cria-llama2-7b-v1.3_peft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-14T22:14:50.964643](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft/blob/main/results_2023-10-14T22-14-50.964643.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.08651426174496644,
        "em_stderr": 0.002878951774581331,
        "f1": 0.14823406040268378,
        "f1_stderr": 0.0031363121759689235,
        "acc": 0.3877234732729646,
        "acc_stderr": 0.009844336814055742
    },
    "harness|drop|3": {
        "em": 0.08651426174496644,
        "em_stderr": 0.002878951774581331,
        "f1": 0.14823406040268378,
        "f1_stderr": 0.0031363121759689235
    },
    "harness|gsm8k|5": {
        "acc": 0.06747536012130402,
        "acc_stderr": 0.006909475136357469
    },
    "harness|winogrande|5": {
        "acc": 0.7079715864246251,
        "acc_stderr": 0.012779198491754015
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
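The aggregated scores shown above can also be pulled directly from the dedicated "results" configuration. The sketch below is a minimal example using only the configuration and split names recorded in this card's metadata; the "latest" split is an alias for the most recent run:

```python
from datasets import load_dataset

# Load the aggregated metrics for this model; "latest" always points
# at the split produced by the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft",
    "results",
    split="latest",
)
print(results[0])
```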
open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft
[ "region:us" ]
2023-08-18T17:57:12+00:00
{"pretty_name": "Evaluation run of davzoku/cria-llama2-7b-v1.3_peft", "dataset_summary": "Dataset automatically created during the evaluation run of model [davzoku/cria-llama2-7b-v1.3_peft](https://huggingface.co/davzoku/cria-llama2-7b-v1.3_peft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T22:14:50.964643](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft/blob/main/results_2023-10-14T22-14-50.964643.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08651426174496644,\n \"em_stderr\": 0.002878951774581331,\n \"f1\": 0.14823406040268378,\n \"f1_stderr\": 0.0031363121759689235,\n \"acc\": 0.3877234732729646,\n \"acc_stderr\": 0.009844336814055742\n },\n \"harness|drop|3\": {\n \"em\": 0.08651426174496644,\n \"em_stderr\": 0.002878951774581331,\n \"f1\": 0.14823406040268378,\n \"f1_stderr\": 0.0031363121759689235\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06747536012130402,\n \"acc_stderr\": 0.006909475136357469\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7079715864246251,\n \"acc_stderr\": 0.012779198491754015\n }\n}\n```", "repo_url": "https://huggingface.co/davzoku/cria-llama2-7b-v1.3_peft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|arc:challenge|25_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T22_14_50.964643", "path": ["**/details_harness|drop|3_2023-10-14T22-14-50.964643.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T22-14-50.964643.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T22_14_50.964643", "path": ["**/details_harness|gsm8k|5_2023-10-14T22-14-50.964643.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T22-14-50.964643.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hellaswag|10_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:00:21.145546.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:00:21.145546.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T19:00:21.145546.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T19:00:21.145546.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T19:00:21.145546.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T22_14_50.964643", "path": ["**/details_harness|winogrande|5_2023-10-14T22-14-50.964643.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T22-14-50.964643.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T19_00_21.145546", "path": ["results_2023-08-17T19:00:21.145546.parquet"]}, {"split": "2023_10_14T22_14_50.964643", "path": ["results_2023-10-14T22-14-50.964643.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T22-14-50.964643.parquet"]}]}]}
2023-10-14T21:15:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of davzoku/cria-llama2-7b-v1.3_peft ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model davzoku/cria-llama2-7b-v1.3_peft on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-14T22:14:50.964643 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of davzoku/cria-llama2-7b-v1.3_peft", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model davzoku/cria-llama2-7b-v1.3_peft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T22:14:50.964643(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of davzoku/cria-llama2-7b-v1.3_peft", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model davzoku/cria-llama2-7b-v1.3_peft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T22:14:50.964643(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of davzoku/cria-llama2-7b-v1.3_peft## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model davzoku/cria-llama2-7b-v1.3_peft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T22:14:50.964643(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
bc15ea1a27e46b3926efaf0b15bbd460511fd676
# Dataset Card for Evaluation run of chargoddard/llama2-22b-blocktriangular

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/chargoddard/llama2-22b-blocktriangular
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [chargoddard/llama2-22b-blocktriangular](https://huggingface.co/chargoddard/llama2-22b-blocktriangular) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T10:02:48.850156](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular/blob/main/results_2023-10-18T10-02-48.850156.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.002202181208053691,
        "em_stderr": 0.000480051081661935,
        "f1": 0.06165897651006692,
        "f1_stderr": 0.0013848407345463738,
        "acc": 0.4357400460634537,
        "acc_stderr": 0.010354651175233286
    },
    "harness|drop|3": {
        "em": 0.002202181208053691,
        "em_stderr": 0.000480051081661935,
        "f1": 0.06165897651006692,
        "f1_stderr": 0.0013848407345463738
    },
    "harness|gsm8k|5": {
        "acc": 0.11220621683093253,
        "acc_stderr": 0.008693743138242383
    },
    "harness|winogrande|5": {
        "acc": 0.7592738752959748,
        "acc_stderr": 0.012015559212224186
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
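Because this dataset aggregates 3 run(s), an individual run can also be loaded through its timestamped split name rather than the moving "latest" alias. A minimal sketch, using only a configuration and split name recorded in this dataset's metadata:

```python
from datasets import load_dataset

# Load one specific evaluation run by its timestamped split name
# (split names are listed per configuration in the dataset metadata).
gsm8k_run = load_dataset(
    "open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular",
    "harness_gsm8k_5",
    split="2023_10_18T10_02_48.850156",
)
```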
open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular
[ "region:us" ]
2023-08-18T17:57:28+00:00
{"pretty_name": "Evaluation run of chargoddard/llama2-22b-blocktriangular", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/llama2-22b-blocktriangular](https://huggingface.co/chargoddard/llama2-22b-blocktriangular) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T10:02:48.850156](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular/blob/main/results_2023-10-18T10-02-48.850156.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002202181208053691,\n \"em_stderr\": 0.000480051081661935,\n \"f1\": 0.06165897651006692,\n \"f1_stderr\": 0.0013848407345463738,\n \"acc\": 0.4357400460634537,\n \"acc_stderr\": 0.010354651175233286\n },\n \"harness|drop|3\": {\n \"em\": 0.002202181208053691,\n \"em_stderr\": 0.000480051081661935,\n \"f1\": 0.06165897651006692,\n \"f1_stderr\": 0.0013848407345463738\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11220621683093253,\n \"acc_stderr\": 0.008693743138242383\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224186\n }\n}\n```", "repo_url": "https://huggingface.co/chargoddard/llama2-22b-blocktriangular", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|arc:challenge|25_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T05_53_32.762527", "path": ["**/details_harness|drop|3_2023-10-18T05-53-32.762527.parquet"]}, {"split": "2023_10_18T10_02_48.850156", "path": ["**/details_harness|drop|3_2023-10-18T10-02-48.850156.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T10-02-48.850156.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T05_53_32.762527", "path": ["**/details_harness|gsm8k|5_2023-10-18T05-53-32.762527.parquet"]}, {"split": "2023_10_18T10_02_48.850156", "path": ["**/details_harness|gsm8k|5_2023-10-18T10-02-48.850156.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T10-02-48.850156.parquet"]}]}, {"config_name": 
"harness_hellaswag_10", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hellaswag|10_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:15:19.075132.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:15:19.075132.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:15:19.075132.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T16:15:19.075132.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T16:15:19.075132.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T16:15:19.075132.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T05_53_32.762527", "path": ["**/details_harness|winogrande|5_2023-10-18T05-53-32.762527.parquet"]}, {"split": "2023_10_18T10_02_48.850156", "path": ["**/details_harness|winogrande|5_2023-10-18T10-02-48.850156.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T10-02-48.850156.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T16_15_19.075132", "path": ["results_2023-08-17T16:15:19.075132.parquet"]}, {"split": "2023_10_18T05_53_32.762527", "path": ["results_2023-10-18T05-53-32.762527.parquet"]}, {"split": "2023_10_18T10_02_48.850156", "path": ["results_2023-10-18T10-02-48.850156.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T10-02-48.850156.parquet"]}]}]}
2023-10-18T09:03:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of chargoddard/llama2-22b-blocktriangular ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model chargoddard/llama2-22b-blocktriangular on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T10:02:48.850156 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of chargoddard/llama2-22b-blocktriangular", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard/llama2-22b-blocktriangular on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T10:02:48.850156(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of chargoddard/llama2-22b-blocktriangular", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard/llama2-22b-blocktriangular on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T10:02:48.850156(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chargoddard/llama2-22b-blocktriangular## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard/llama2-22b-blocktriangular on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T10:02:48.850156(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3e0edd79ca2cad75163df55900721202acbb7570
# Dataset Card for Evaluation run of FlagAlpha/Llama2-Chinese-13b-Chat

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/FlagAlpha/Llama2-Chinese-13b-Chat
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [FlagAlpha/Llama2-Chinese-13b-Chat](https://huggingface.co/FlagAlpha/Llama2-Chinese-13b-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following (a sketch for loading the aggregated "results" configuration follows this card):

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T06:13:18.397506](https://huggingface.co/datasets/open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat/blob/main/results_2023-10-13T06-13-18.397506.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.3886325503355705,
        "em_stderr": 0.004991836977358219,
        "f1": 0.4460098573825516,
        "f1_stderr": 0.004836724027731064,
        "acc": 0.4437472960609105,
        "acc_stderr": 0.010555580633054316
    },
    "harness|drop|3": {
        "em": 0.3886325503355705,
        "em_stderr": 0.004991836977358219,
        "f1": 0.4460098573825516,
        "f1_stderr": 0.004836724027731064
    },
    "harness|gsm8k|5": {
        "acc": 0.12585291887793784,
        "acc_stderr": 0.009136212598406319
    },
    "harness|winogrande|5": {
        "acc": 0.7616416732438832,
        "acc_stderr": 0.011974948667702313
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
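As a complement to the per-task example in the card above, here is a minimal sketch of pulling the aggregated metrics via the "results" configuration. It assumes only the standard `datasets.load_dataset` API, and that "results" exposes a "latest" split in the same way the per-task configurations in this card's metadata do; the column layout of the results table is not documented in the card, so the code inspects it rather than assuming field names.

```python
from datasets import load_dataset

# A minimal sketch, assuming the "results" config and "latest" split follow
# the same pattern as the per-task configs declared in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat",
    "results",
    split="latest",
)

# The exact schema of the results table is not documented in this card,
# so print the features and the first row to explore it.
print(results.features)
print(results[0])
```

The same pattern applies to any per-task configuration listed in the metadata (for example "harness_gsm8k_5"), substituting the config name and either a timestamped split or "latest".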
open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat
[ "region:us" ]
2023-08-18T17:57:38+00:00
{"pretty_name": "Evaluation run of FlagAlpha/Llama2-Chinese-13b-Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [FlagAlpha/Llama2-Chinese-13b-Chat](https://huggingface.co/FlagAlpha/Llama2-Chinese-13b-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T06:13:18.397506](https://huggingface.co/datasets/open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat/blob/main/results_2023-10-13T06-13-18.397506.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3886325503355705,\n \"em_stderr\": 0.004991836977358219,\n \"f1\": 0.4460098573825516,\n \"f1_stderr\": 0.004836724027731064,\n \"acc\": 0.4437472960609105,\n \"acc_stderr\": 0.010555580633054316\n },\n \"harness|drop|3\": {\n \"em\": 0.3886325503355705,\n \"em_stderr\": 0.004991836977358219,\n \"f1\": 0.4460098573825516,\n \"f1_stderr\": 0.004836724027731064\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12585291887793784,\n \"acc_stderr\": 0.009136212598406319\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702313\n }\n}\n```", "repo_url": "https://huggingface.co/FlagAlpha/Llama2-Chinese-13b-Chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|arc:challenge|25_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T06_13_18.397506", "path": ["**/details_harness|drop|3_2023-10-13T06-13-18.397506.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T06-13-18.397506.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T06_13_18.397506", "path": ["**/details_harness|gsm8k|5_2023-10-13T06-13-18.397506.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T06-13-18.397506.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hellaswag|10_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:12:34.146693.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:12:34.146693.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T19:12:34.146693.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T19:12:34.146693.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T19:12:34.146693.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T06_13_18.397506", "path": ["**/details_harness|winogrande|5_2023-10-13T06-13-18.397506.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T06-13-18.397506.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T19_12_34.146693", "path": ["results_2023-08-17T19:12:34.146693.parquet"]}, {"split": "2023_10_13T06_13_18.397506", "path": ["results_2023-10-13T06-13-18.397506.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T06-13-18.397506.parquet"]}]}]}
2023-10-13T05:13:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FlagAlpha/Llama2-Chinese-13b-Chat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model FlagAlpha/Llama2-Chinese-13b-Chat on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T06:13:18.397506 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
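The flattened card above says "To load the details from a run, you can for instance do the following:" but the code block itself is missing from this copy. A minimal sketch in the style of the other cards, again assuming the details_<org>__<model> repository naming and using the harness_winogrande_5 configuration listed in this record's metadata:

```python
from datasets import load_dataset

# Per-example details for one evaluated task; "latest" resolves to the
# newest run (2023-10-13T06:13:18.397506 for this record).
data = load_dataset(
    "open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat",  # assumed name
    "harness_winogrande_5",
    split="latest",
)
```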
[ "# Dataset Card for Evaluation run of FlagAlpha/Llama2-Chinese-13b-Chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FlagAlpha/Llama2-Chinese-13b-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T06:13:18.397506(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FlagAlpha/Llama2-Chinese-13b-Chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FlagAlpha/Llama2-Chinese-13b-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T06:13:18.397506(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FlagAlpha/Llama2-Chinese-13b-Chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model FlagAlpha/Llama2-Chinese-13b-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T06:13:18.397506(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
19bb89770d4b19a30ec0200b1df4c2381b01e9de
# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [yihan6324/llama2-7b-instructmining-orca-40k](https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-18T00:53:27.654117](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k/blob/main/results_2023-08-18T00%3A53%3A27.654117.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4847120233306423, "acc_stderr": 0.03527399847085323, "acc_norm": 0.4884455010512822, "acc_norm_stderr": 0.035257414280301984, "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836882, "mc2": 0.5103220670450638, "mc2_stderr": 0.015890639542177364 }, "harness|arc:challenge|25": { "acc": 0.5298634812286689, "acc_stderr": 0.014585305840007105, "acc_norm": 0.5674061433447098, "acc_norm_stderr": 0.014478005694182524 }, "harness|hellaswag|10": { "acc": 0.6196972714598685, "acc_stderr": 0.004844690404713595, "acc_norm": 0.8024297948615814, "acc_norm_stderr": 0.0039735233080143454 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750575, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750575 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.48026315789473684, "acc_stderr": 0.040657710025626036, "acc_norm": 0.48026315789473684, "acc_norm_stderr": 0.040657710025626036 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5547169811320755, "acc_stderr": 0.030588052974270655, "acc_norm": 0.5547169811320755, "acc_norm_stderr": 0.030588052974270655 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4583333333333333, "acc_stderr": 0.04166666666666665, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.04166666666666665 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.43352601156069365, "acc_stderr": 0.03778621079092055, "acc_norm": 0.43352601156069365, "acc_norm_stderr": 0.03778621079092055 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.044405219061793275, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.044405219061793275 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.40425531914893614, "acc_stderr": 0.03208115750788684, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.03208115750788684 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813344, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813344 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4206896551724138, "acc_stderr": 0.0411391498118926, "acc_norm": 0.4206896551724138, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.291005291005291, "acc_stderr": 0.023393826500484865, "acc_norm": 0.291005291005291, "acc_norm_stderr": 0.023393826500484865 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.23809523809523808, "acc_stderr": 0.03809523809523811, 
"acc_norm": 0.23809523809523808, "acc_norm_stderr": 0.03809523809523811 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5645161290322581, "acc_stderr": 0.028206225591502737, "acc_norm": 0.5645161290322581, "acc_norm_stderr": 0.028206225591502737 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39408866995073893, "acc_stderr": 0.03438157967036543, "acc_norm": 0.39408866995073893, "acc_norm_stderr": 0.03438157967036543 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6121212121212121, "acc_stderr": 0.03804913653971011, "acc_norm": 0.6121212121212121, "acc_norm_stderr": 0.03804913653971011 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6212121212121212, "acc_stderr": 0.03456088731993747, "acc_norm": 0.6212121212121212, "acc_norm_stderr": 0.03456088731993747 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6528497409326425, "acc_stderr": 0.03435696168361355, "acc_norm": 0.6528497409326425, "acc_norm_stderr": 0.03435696168361355 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.46153846153846156, "acc_stderr": 0.025275892070240634, "acc_norm": 0.46153846153846156, "acc_norm_stderr": 0.025275892070240634 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.026335739404055803, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.026335739404055803 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4831932773109244, "acc_stderr": 0.03246013680375308, "acc_norm": 0.4831932773109244, "acc_norm_stderr": 0.03246013680375308 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6623853211009174, "acc_stderr": 0.02027526598663891, "acc_norm": 0.6623853211009174, "acc_norm_stderr": 0.02027526598663891 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4351851851851852, "acc_stderr": 0.03381200005643525, "acc_norm": 0.4351851851851852, "acc_norm_stderr": 0.03381200005643525 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6372549019607843, "acc_stderr": 0.03374499356319355, "acc_norm": 0.6372549019607843, "acc_norm_stderr": 0.03374499356319355 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6582278481012658, "acc_stderr": 0.03087453753755362, "acc_norm": 0.6582278481012658, "acc_norm_stderr": 0.03087453753755362 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.515695067264574, "acc_stderr": 0.0335412657542081, "acc_norm": 0.515695067264574, "acc_norm_stderr": 0.0335412657542081 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6030534351145038, "acc_stderr": 0.04291135671009225, "acc_norm": 0.6030534351145038, "acc_norm_stderr": 0.04291135671009225 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6363636363636364, "acc_stderr": 0.043913262867240704, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.043913262867240704 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5092592592592593, "acc_stderr": 0.04832853553437055, "acc_norm": 0.5092592592592593, 
"acc_norm_stderr": 0.04832853553437055 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.48466257668711654, "acc_stderr": 0.039265223787088424, "acc_norm": 0.48466257668711654, "acc_norm_stderr": 0.039265223787088424 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04547960999764376, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04547960999764376 }, "harness|hendrycksTest-management|5": { "acc": 0.6310679611650486, "acc_stderr": 0.0477761518115674, "acc_norm": 0.6310679611650486, "acc_norm_stderr": 0.0477761518115674 }, "harness|hendrycksTest-marketing|5": { "acc": 0.6752136752136753, "acc_stderr": 0.03067902276549883, "acc_norm": 0.6752136752136753, "acc_norm_stderr": 0.03067902276549883 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6704980842911877, "acc_stderr": 0.016808322261740467, "acc_norm": 0.6704980842911877, "acc_norm_stderr": 0.016808322261740467 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.47109826589595377, "acc_stderr": 0.02687408588351835, "acc_norm": 0.47109826589595377, "acc_norm_stderr": 0.02687408588351835 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25027932960893856, "acc_stderr": 0.014487500852850407, "acc_norm": 0.25027932960893856, "acc_norm_stderr": 0.014487500852850407 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5522875816993464, "acc_stderr": 0.02847293847803353, "acc_norm": 0.5522875816993464, "acc_norm_stderr": 0.02847293847803353 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5755627009646302, "acc_stderr": 0.028071928247946208, "acc_norm": 0.5755627009646302, "acc_norm_stderr": 0.028071928247946208 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5308641975308642, "acc_stderr": 0.02776768960683393, "acc_norm": 0.5308641975308642, "acc_norm_stderr": 0.02776768960683393 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3404255319148936, "acc_stderr": 0.028267657482650154, "acc_norm": 0.3404255319148936, "acc_norm_stderr": 0.028267657482650154 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.38722294654498046, "acc_stderr": 0.012441155326854924, "acc_norm": 0.38722294654498046, "acc_norm_stderr": 0.012441155326854924 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.47794117647058826, "acc_stderr": 0.030343264224213528, "acc_norm": 0.47794117647058826, "acc_norm_stderr": 0.030343264224213528 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4395424836601307, "acc_stderr": 0.02007942040808792, "acc_norm": 0.4395424836601307, "acc_norm_stderr": 0.02007942040808792 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5454545454545454, "acc_stderr": 0.04769300568972744, "acc_norm": 0.5454545454545454, "acc_norm_stderr": 0.04769300568972744 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6081632653061224, "acc_stderr": 0.03125127591089165, "acc_norm": 0.6081632653061224, "acc_norm_stderr": 0.03125127591089165 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6766169154228856, "acc_stderr": 0.03307615947979034, "acc_norm": 0.6766169154228856, "acc_norm_stderr": 0.03307615947979034 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.03799857454479637, 
"acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.03799857454479637 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03615507630310935, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03615507630310935 }, "harness|truthfulqa:mc|0": { "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836882, "mc2": 0.5103220670450638, "mc2_stderr": 0.015890639542177364 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k
[ "region:us" ]
2023-08-18T17:57:47+00:00
{"pretty_name": "Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k", "dataset_summary": "Dataset automatically created during the evaluation run of model [yihan6324/llama2-7b-instructmining-orca-40k](https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-18T00:53:27.654117](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k/blob/main/results_2023-08-18T00%3A53%3A27.654117.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4847120233306423,\n \"acc_stderr\": 0.03527399847085323,\n \"acc_norm\": 0.4884455010512822,\n \"acc_norm_stderr\": 0.035257414280301984,\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5103220670450638,\n \"mc2_stderr\": 0.015890639542177364\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182524\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6196972714598685,\n \"acc_stderr\": 0.004844690404713595,\n \"acc_norm\": 0.8024297948615814,\n \"acc_norm_stderr\": 0.0039735233080143454\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.291005291005291,\n \"acc_stderr\": 0.023393826500484865,\n \"acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484865\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5645161290322581,\n \"acc_stderr\": 0.028206225591502737,\n \"acc_norm\": 0.5645161290322581,\n \"acc_norm_stderr\": 0.028206225591502737\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036543,\n \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036543\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.03804913653971011,\n \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.03804913653971011\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n \"acc_norm\": 0.6528497409326425,\n 
\"acc_norm_stderr\": 0.03435696168361355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240634,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240634\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6623853211009174,\n \"acc_stderr\": 0.02027526598663891,\n \"acc_norm\": 0.6623853211009174,\n \"acc_norm_stderr\": 0.02027526598663891\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.03374499356319355,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.03374499356319355\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6582278481012658,\n \"acc_stderr\": 0.03087453753755362,\n \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.03087453753755362\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.48466257668711654,\n \"acc_stderr\": 0.039265223787088424,\n \"acc_norm\": 0.48466257668711654,\n \"acc_norm_stderr\": 0.039265223787088424\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6704980842911877,\n \"acc_stderr\": 0.016808322261740467,\n \"acc_norm\": 0.6704980842911877,\n \"acc_norm_stderr\": 0.016808322261740467\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.47109826589595377,\n \"acc_stderr\": 0.02687408588351835,\n \"acc_norm\": 0.47109826589595377,\n \"acc_norm_stderr\": 0.02687408588351835\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.014487500852850407,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.014487500852850407\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.02776768960683393,\n \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.02776768960683393\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650154,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650154\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n \"acc_stderr\": 0.012441155326854924,\n \"acc_norm\": 0.38722294654498046,\n \"acc_norm_stderr\": 0.012441155326854924\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213528,\n \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213528\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4395424836601307,\n \"acc_stderr\": 0.02007942040808792,\n \"acc_norm\": 0.4395424836601307,\n \"acc_norm_stderr\": 0.02007942040808792\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n \"acc_stderr\": 0.03307615947979034,\n \"acc_norm\": 0.6766169154228856,\n \"acc_norm_stderr\": 0.03307615947979034\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310935,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310935\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5103220670450638,\n \"mc2_stderr\": 0.015890639542177364\n }\n}\n```", "repo_url": "https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|arc:challenge|25_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hellaswag|10_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T00:53:27.654117.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T00_53_27.654117", "path": ["results_2023-08-18T00:53:27.654117.parquet"]}, {"split": "latest", "path": ["results_2023-08-18T00:53:27.654117.parquet"]}]}]}
2023-08-27T11:41:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model yihan6324/llama2-7b-instructmining-orca-40k on the Open LLM Leaderboard.

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (a sketch follows this card):

## Latest results

These are the latest results from run 2023-08-18T00:53:27.654117 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
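The flattened card above keeps the sentence "To load the details from a run, you can for instance do the following:" but the snippet itself was stripped. A minimal sketch of the usual call, using the same assumed repository id as above and the harness_truthfulqa_mc_0 config declared in the metadata:

```python
from datasets import load_dataset

# Repository id assumed from the naming convention used throughout these cards.
data = load_dataset(
    "open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k",
    "harness_truthfulqa_mc_0",
    split="train",
)
```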
[ "# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yihan6324/llama2-7b-instructmining-orca-40k on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-18T00:53:27.654117 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yihan6324/llama2-7b-instructmining-orca-40k on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-18T00:53:27.654117 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yihan6324/llama2-7b-instructmining-orca-40k on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-18T00:53:27.654117 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
360c14ae4ba87ea9b323f4e601657d2d1c9d1e19
# Dataset Card for Evaluation run of yihan6324/llama2-13b-instructmining-40k-sharegpt

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/yihan6324/llama2-13b-instructmining-40k-sharegpt
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [yihan6324/llama2-13b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-13b-instructmining-40k-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-08-17T15:06:33.773565](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt/blob/main/results_2023-08-17T15%3A06%3A33.773565.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {"acc": 0.5659221179678169, "acc_stderr": 0.03435610194042996, "acc_norm": 0.5698526105353496, "acc_norm_stderr": 0.03433517528186645, "mc1": 0.35862913096695226, "mc1_stderr": 0.016789289499502022, "mc2": 0.5244082340441981, "mc2_stderr": 0.015623466277080963},
    "harness|arc:challenge|25": {"acc": 0.5656996587030717, "acc_stderr": 0.01448470304885736, "acc_norm": 0.5998293515358362, "acc_norm_stderr": 0.014317197787809169},
    "harness|hellaswag|10": {"acc": 0.6328420633339972, "acc_stderr": 0.004810449343572395, "acc_norm": 0.8306114319856602, "acc_norm_stderr": 0.003743281749373634},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.625, "acc_stderr": 0.039397364351956274, "acc_norm": 0.625, "acc_norm_stderr": 0.039397364351956274},
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025},
    "harness|hendrycksTest-clinical_knowledge|5": {"acc": 0.6150943396226415, "acc_stderr": 0.02994649856769995, "acc_norm": 0.6150943396226415, "acc_norm_stderr": 0.02994649856769995},
    "harness|hendrycksTest-college_biology|5": {"acc": 0.6111111111111112, "acc_stderr": 0.04076663253918567, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04076663253918567},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102},
    "harness|hendrycksTest-college_computer_science|5": {"acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391},
    "harness|hendrycksTest-college_medicine|5": {"acc": 0.5895953757225434, "acc_stderr": 0.03750757044895537, "acc_norm": 0.5895953757225434, "acc_norm_stderr": 0.03750757044895537},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.3431372549019608, "acc_stderr": 0.04724007352383887, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383887},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845},
    "harness|hendrycksTest-conceptual_physics|5": {"acc": 0.4595744680851064, "acc_stderr": 0.03257901482099835, "acc_norm": 0.4595744680851064, "acc_norm_stderr": 0.03257901482099835},
    "harness|hendrycksTest-econometrics|5": {"acc": 0.23684210526315788, "acc_stderr": 0.039994238792813344, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813344},
    "harness|hendrycksTest-electrical_engineering|5": {"acc": 0.503448275862069, "acc_stderr": 0.04166567577101579, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.04166567577101579},
    "harness|hendrycksTest-elementary_mathematics|5": {"acc": 0.29365079365079366, "acc_stderr": 0.023456037383982026, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.023456037383982026},
    "harness|hendrycksTest-formal_logic|5": {"acc": 0.36507936507936506, "acc_stderr": 0.04306241259127153, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.04306241259127153},
    "harness|hendrycksTest-global_facts|5": {"acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196},
    "harness|hendrycksTest-high_school_biology|5": {"acc": 0.6709677419354839, "acc_stderr": 0.026729499068349958, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.026729499068349958},
    "harness|hendrycksTest-high_school_chemistry|5": {"acc": 0.45320197044334976, "acc_stderr": 0.03502544650845872, "acc_norm": 0.45320197044334976, "acc_norm_stderr": 0.03502544650845872},
    "harness|hendrycksTest-high_school_computer_science|5": {"acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884},
    "harness|hendrycksTest-high_school_european_history|5": {"acc": 0.6787878787878788, "acc_stderr": 0.0364620496325381, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.0364620496325381},
    "harness|hendrycksTest-high_school_geography|5": {"acc": 0.7171717171717171, "acc_stderr": 0.03208779558786753, "acc_norm": 0.7171717171717171, "acc_norm_stderr": 0.03208779558786753},
    "harness|hendrycksTest-high_school_government_and_politics|5": {"acc": 0.8134715025906736, "acc_stderr": 0.02811209121011748, "acc_norm": 0.8134715025906736, "acc_norm_stderr": 0.02811209121011748},
    "harness|hendrycksTest-high_school_macroeconomics|5": {"acc": 0.5333333333333333, "acc_stderr": 0.025294608023986472, "acc_norm": 0.5333333333333333, "acc_norm_stderr": 0.025294608023986472},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc": 0.26666666666666666, "acc_stderr": 0.026962424325073838, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.026962424325073838},
    "harness|hendrycksTest-high_school_microeconomics|5": {"acc": 0.5378151260504201, "acc_stderr": 0.032385469487589795, "acc_norm": 0.5378151260504201, "acc_norm_stderr": 0.032385469487589795},
    "harness|hendrycksTest-high_school_physics|5": {"acc": 0.304635761589404, "acc_stderr": 0.037579499229433426, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.037579499229433426},
    "harness|hendrycksTest-high_school_psychology|5": {"acc": 0.7669724770642202, "acc_stderr": 0.018125669180861507, "acc_norm": 0.7669724770642202, "acc_norm_stderr": 0.018125669180861507},
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.4305555555555556, "acc_stderr": 0.03376922151252336, "acc_norm": 0.4305555555555556, "acc_norm_stderr": 0.03376922151252336},
    "harness|hendrycksTest-high_school_us_history|5": {"acc": 0.7647058823529411, "acc_stderr": 0.029771775228145638, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.029771775228145638},
    "harness|hendrycksTest-high_school_world_history|5": {"acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644},
    "harness|hendrycksTest-human_aging|5": {"acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131},
    "harness|hendrycksTest-human_sexuality|5": {"acc": 0.6183206106870229, "acc_stderr": 0.04260735157644559, "acc_norm": 0.6183206106870229, "acc_norm_stderr": 0.04260735157644559},
    "harness|hendrycksTest-international_law|5": {"acc": 0.71900826446281, "acc_stderr": 0.041032038305145124, "acc_norm": 0.71900826446281, "acc_norm_stderr": 0.041032038305145124},
    "harness|hendrycksTest-jurisprudence|5": {"acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614},
    "harness|hendrycksTest-logical_fallacies|5": {"acc": 0.6625766871165644, "acc_stderr": 0.03714908409935574, "acc_norm": 0.6625766871165644, "acc_norm_stderr": 0.03714908409935574},
    "harness|hendrycksTest-machine_learning|5": {"acc": 0.33035714285714285, "acc_stderr": 0.04464285714285713, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.04464285714285713},
    "harness|hendrycksTest-management|5": {"acc": 0.6796116504854369, "acc_stderr": 0.04620284082280041, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280041},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8418803418803419, "acc_stderr": 0.02390232554956041, "acc_norm": 0.8418803418803419, "acc_norm_stderr": 0.02390232554956041},
    "harness|hendrycksTest-medical_genetics|5": {"acc": 0.58, "acc_stderr": 0.04960449637488583, "acc_norm": 0.58, "acc_norm_stderr": 0.04960449637488583},
    "harness|hendrycksTest-miscellaneous|5": {"acc": 0.7509578544061303, "acc_stderr": 0.015464676163395958, "acc_norm": 0.7509578544061303, "acc_norm_stderr": 0.015464676163395958},
    "harness|hendrycksTest-moral_disputes|5": {"acc": 0.630057803468208, "acc_stderr": 0.025992472029306386, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.025992472029306386},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.4122905027932961, "acc_stderr": 0.01646320023811453, "acc_norm": 0.4122905027932961, "acc_norm_stderr": 0.01646320023811453},
    "harness|hendrycksTest-nutrition|5": {"acc": 0.6013071895424836, "acc_stderr": 0.028036092273891776, "acc_norm": 0.6013071895424836, "acc_norm_stderr": 0.028036092273891776},
    "harness|hendrycksTest-philosophy|5": {"acc": 0.6205787781350482, "acc_stderr": 0.027559949802347817, "acc_norm": 0.6205787781350482, "acc_norm_stderr": 0.027559949802347817},
    "harness|hendrycksTest-prehistory|5": {"acc": 0.6265432098765432, "acc_stderr": 0.026915003011380154, "acc_norm": 0.6265432098765432, "acc_norm_stderr": 0.026915003011380154},
    "harness|hendrycksTest-professional_accounting|5": {"acc": 0.4219858156028369, "acc_stderr": 0.029462189233370593, "acc_norm": 0.4219858156028369, "acc_norm_stderr": 0.029462189233370593},
    "harness|hendrycksTest-professional_law|5": {"acc": 0.4256844850065189, "acc_stderr": 0.012628393551811943, "acc_norm": 0.4256844850065189, "acc_norm_stderr": 0.012628393551811943},
    "harness|hendrycksTest-professional_medicine|5": {"acc": 0.5735294117647058, "acc_stderr": 0.03004261583271486, "acc_norm": 0.5735294117647058, "acc_norm_stderr": 0.03004261583271486},
    "harness|hendrycksTest-professional_psychology|5": {"acc": 0.5784313725490197, "acc_stderr": 0.01997742260022747, "acc_norm": 0.5784313725490197, "acc_norm_stderr": 0.01997742260022747},
    "harness|hendrycksTest-public_relations|5": {"acc": 0.6181818181818182, "acc_stderr": 0.046534298079135075, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.046534298079135075},
    "harness|hendrycksTest-security_studies|5": {"acc": 0.6285714285714286, "acc_stderr": 0.03093285879278985, "acc_norm": 0.6285714285714286, "acc_norm_stderr": 0.03093285879278985},
    "harness|hendrycksTest-sociology|5": {"acc": 0.7761194029850746, "acc_stderr": 0.029475250236017193, "acc_norm": 0.7761194029850746, "acc_norm_stderr": 0.029475250236017193},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934},
    "harness|hendrycksTest-virology|5": {"acc": 0.4939759036144578, "acc_stderr": 0.03892212195333045, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333045},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7719298245614035, "acc_stderr": 0.032180937956023566, "acc_norm": 0.7719298245614035, "acc_norm_stderr": 0.032180937956023566},
    "harness|truthfulqa:mc|0": {"mc1": 0.35862913096695226, "mc1_stderr": 0.016789289499502022, "mc2": 0.5244082340441981, "mc2_stderr": 0.015623466277080963}
}
```

(A short sanity-check sketch over these per-task numbers follows this card.)

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt
[ "region:us" ]
2023-08-18T17:57:56+00:00
{"pretty_name": "Evaluation run of yihan6324/llama2-13b-instructmining-40k-sharegpt", "dataset_summary": "Dataset automatically created during the evaluation run of model [yihan6324/llama2-13b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-13b-instructmining-40k-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-17T15:06:33.773565](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt/blob/main/results_2023-08-17T15%3A06%3A33.773565.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5659221179678169,\n \"acc_stderr\": 0.03435610194042996,\n \"acc_norm\": 0.5698526105353496,\n \"acc_norm_stderr\": 0.03433517528186645,\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5244082340441981,\n \"mc2_stderr\": 0.015623466277080963\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.01448470304885736,\n \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6328420633339972,\n \"acc_stderr\": 0.004810449343572395,\n \"acc_norm\": 0.8306114319856602,\n \"acc_norm_stderr\": 0.003743281749373634\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982026,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982026\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786753,\n \"acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786753\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011748,\n \"acc_norm\": 0.8134715025906736,\n 
\"acc_norm_stderr\": 0.02811209121011748\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.025294608023986472,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.025294608023986472\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7669724770642202,\n \"acc_stderr\": 0.018125669180861507,\n \"acc_norm\": 0.7669724770642202,\n \"acc_norm_stderr\": 0.018125669180861507\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145638,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145638\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.041032038305145124,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.02390232554956041,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.02390232554956041\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n \"acc_stderr\": 0.015464676163395958,\n \"acc_norm\": 0.7509578544061303,\n \"acc_norm_stderr\": 0.015464676163395958\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306386,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306386\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n \"acc_stderr\": 0.01646320023811453,\n \"acc_norm\": 0.4122905027932961,\n \"acc_norm_stderr\": 0.01646320023811453\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n \"acc_stderr\": 0.027559949802347817,\n \"acc_norm\": 0.6205787781350482,\n \"acc_norm_stderr\": 0.027559949802347817\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n \"acc_stderr\": 0.012628393551811943,\n \"acc_norm\": 0.4256844850065189,\n \"acc_norm_stderr\": 0.012628393551811943\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03004261583271486,\n \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03004261583271486\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.01997742260022747,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.01997742260022747\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5244082340441981,\n \"mc2_stderr\": 0.015623466277080963\n }\n}\n```", "repo_url": "https://huggingface.co/yihan6324/llama2-13b-instructmining-40k-sharegpt", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|arc:challenge|25_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hellaswag|10_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:06:33.773565.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:06:33.773565.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:06:33.773565.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T15:06:33.773565.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T15:06:33.773565.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T15_06_33.773565", "path": ["results_2023-08-17T15:06:33.773565.parquet"]}, {"split": "latest", "path": ["results_2023-08-17T15:06:33.773565.parquet"]}]}]}
2023-08-27T11:41:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yihan6324/llama2-13b-instructmining-40k-sharegpt ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model yihan6324/llama2-13b-instructmining-40k-sharegpt on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-17T15:06:33.773565 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
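The loading snippet referenced by "do the following:" above was stripped from this record's text; below is a minimal sketch of what it would look like, assuming the repository id follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used by the other cards in this file (the `harness_arc_challenge_25` config name is taken from this record's metadata):

```python
from datasets import load_dataset

# Assumed repo id, derived from the details_<org>__<model> convention of sibling cards;
# "harness_arc_challenge_25" is one of the configs listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt",
    "harness_arc_challenge_25",
    split="train",
)
```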
[ "# Dataset Card for Evaluation run of yihan6324/llama2-13b-instructmining-40k-sharegpt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yihan6324/llama2-13b-instructmining-40k-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T15:06:33.773565 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yihan6324/llama2-13b-instructmining-40k-sharegpt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yihan6324/llama2-13b-instructmining-40k-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T15:06:33.773565 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yihan6324/llama2-13b-instructmining-40k-sharegpt## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yihan6324/llama2-13b-instructmining-40k-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-17T15:06:33.773565 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9f4fe6ec3bf2c4e132d5ca9c6a2980a6d0b473af
# Dataset Card for Evaluation run of heegyu/LIMA-13b-hf

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/heegyu/LIMA-13b-hf
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [heegyu/LIMA-13b-hf](https://huggingface.co/heegyu/LIMA-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__LIMA-13b-hf",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T00:08:39.312434](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA-13b-hf/blob/main/results_2023-10-22T00-08-39.312434.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0020973154362416107,
        "em_stderr": 0.000468506503036833,
        "f1": 0.05783347315436248,
        "f1_stderr": 0.0013197558360646307,
        "acc": 0.4303028471618438,
        "acc_stderr": 0.009812237277361156
    },
    "harness|drop|3": {
        "em": 0.0020973154362416107,
        "em_stderr": 0.000468506503036833,
        "f1": 0.05783347315436248,
        "f1_stderr": 0.0013197558360646307
    },
    "harness|gsm8k|5": {
        "acc": 0.0887035633055345,
        "acc_stderr": 0.00783145873705871
    },
    "harness|winogrande|5": {
        "acc": 0.7719021310181531,
        "acc_stderr": 0.0117930158176636
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
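As a complementary usage note, the aggregated metrics shown under "Latest results" live in the "results" configuration; a short sketch of reading them, assuming the "latest" split naming used throughout these cards:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_heegyu__LIMA-13b-hf",
    "results",
    split="latest",
)
print(results[0])  # aggregated scores for the latest run
```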
open-llm-leaderboard/details_heegyu__LIMA-13b-hf
[ "region:us" ]
2023-08-18T17:58:06+00:00
{"pretty_name": "Evaluation run of heegyu/LIMA-13b-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [heegyu/LIMA-13b-hf](https://huggingface.co/heegyu/LIMA-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__LIMA-13b-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T00:08:39.312434](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA-13b-hf/blob/main/results_2023-10-22T00-08-39.312434.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.000468506503036833,\n \"f1\": 0.05783347315436248,\n \"f1_stderr\": 0.0013197558360646307,\n \"acc\": 0.4303028471618438,\n \"acc_stderr\": 0.009812237277361156\n },\n \"harness|drop|3\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.000468506503036833,\n \"f1\": 0.05783347315436248,\n \"f1_stderr\": 0.0013197558360646307\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0887035633055345,\n \"acc_stderr\": 0.00783145873705871\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.0117930158176636\n }\n}\n```", "repo_url": "https://huggingface.co/heegyu/LIMA-13b-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|arc:challenge|25_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T00_08_39.312434", "path": ["**/details_harness|drop|3_2023-10-22T00-08-39.312434.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T00-08-39.312434.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T00_08_39.312434", "path": ["**/details_harness|gsm8k|5_2023-10-22T00-08-39.312434.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T00-08-39.312434.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hellaswag|10_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T19:40:51.725558.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T19:40:51.725558.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T00_08_39.312434", "path": ["**/details_harness|winogrande|5_2023-10-22T00-08-39.312434.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T00-08-39.312434.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T19_40_51.725558", "path": ["results_2023-08-17T19:40:51.725558.parquet"]}, {"split": "2023_10_22T00_08_39.312434", "path": ["results_2023-10-22T00-08-39.312434.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T00-08-39.312434.parquet"]}]}]}
2023-10-21T23:08:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of heegyu/LIMA-13b-hf ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model heegyu/LIMA-13b-hf on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-22T00:08:39.312434 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
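The flattened card above elides the actual load snippet. A minimal sketch of what it would look like for this record, assuming the details repository follows the same `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other cards in this dump (the repo id below is inferred from that pattern, not quoted from the source):

```python
from datasets import load_dataset

# Repo id inferred from the naming pattern of the sibling cards in this dump;
# the flattened text above drops the original snippet, so this is a reconstruction.
data = load_dataset(
    "open-llm-leaderboard/details_heegyu__LIMA-13b-hf",
    "harness_winogrande_5",  # one of the 64 configurations listed in the metadata above
    split="train",           # per the card, "train" always points to the latest results
)
```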
[ "# Dataset Card for Evaluation run of heegyu/LIMA-13b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/LIMA-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T00:08:39.312434(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of heegyu/LIMA-13b-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/LIMA-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T00:08:39.312434(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of heegyu/LIMA-13b-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model heegyu/LIMA-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T00:08:39.312434(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7905b4857b0db10a03cfbd46c3bff756ac8d4710
# Dataset Card for Evaluation run of minlik/chinese-alpaca-33b-merged

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/minlik/chinese-alpaca-33b-merged
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [minlik/chinese-alpaca-33b-merged](https://huggingface.co/minlik/chinese-alpaca-33b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T18:26:45.770833](https://huggingface.co/datasets/open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged/blob/main/results_2023-09-23T18-26-45.770833.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.34375,
        "em_stderr": 0.004864023482291936,
        "f1": 0.39666317114094085,
        "f1_stderr": 0.00476283705283174,
        "acc": 0.4206081596579169,
        "acc_stderr": 0.00973840020904149
    },
    "harness|drop|3": {
        "em": 0.34375,
        "em_stderr": 0.004864023482291936,
        "f1": 0.39666317114094085,
        "f1_stderr": 0.00476283705283174
    },
    "harness|gsm8k|5": {
        "acc": 0.0803639120545868,
        "acc_stderr": 0.007488258573239077
    },
    "harness|winogrande|5": {
        "acc": 0.760852407261247,
        "acc_stderr": 0.011988541844843902
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
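As a follow-up to the load snippet in the card above, here is a minimal sketch of pulling the aggregated scores instead of per-task details. The "results" configuration and its "latest" split are taken from this record's metadata below; the exact columns of the returned rows are an assumption about the parquet schema:

```python
from datasets import load_dataset

# "results" is one of the configurations declared in this record's metadata;
# its "latest" split maps to results_2023-09-23T18-26-45.770833.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged",
    "results",
    split="latest",
)
# Exact column names depend on the parquet schema (an assumption here).
print(results[0])
```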
open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged
[ "region:us" ]
2023-08-18T17:58:16+00:00
{"pretty_name": "Evaluation run of minlik/chinese-alpaca-33b-merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [minlik/chinese-alpaca-33b-merged](https://huggingface.co/minlik/chinese-alpaca-33b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T18:26:45.770833](https://huggingface.co/datasets/open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged/blob/main/results_2023-09-23T18-26-45.770833.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.34375,\n \"em_stderr\": 0.004864023482291936,\n \"f1\": 0.39666317114094085,\n \"f1_stderr\": 0.00476283705283174,\n \"acc\": 0.4206081596579169,\n \"acc_stderr\": 0.00973840020904149\n },\n \"harness|drop|3\": {\n \"em\": 0.34375,\n \"em_stderr\": 0.004864023482291936,\n \"f1\": 0.39666317114094085,\n \"f1_stderr\": 0.00476283705283174\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0803639120545868,\n \"acc_stderr\": 0.007488258573239077\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843902\n }\n}\n```", "repo_url": "https://huggingface.co/minlik/chinese-alpaca-33b-merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|arc:challenge|25_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T18_26_45.770833", "path": ["**/details_harness|drop|3_2023-09-23T18-26-45.770833.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T18-26-45.770833.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T18_26_45.770833", "path": ["**/details_harness|gsm8k|5_2023-09-23T18-26-45.770833.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T18-26-45.770833.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hellaswag|10_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:44:03.944390.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:44:03.944390.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-18T14:44:03.944390.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T14:44:03.944390.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-18T14:44:03.944390.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T18_26_45.770833", "path": ["**/details_harness|winogrande|5_2023-09-23T18-26-45.770833.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T18-26-45.770833.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_18T14_44_03.944390", "path": ["results_2023-08-18T14:44:03.944390.parquet"]}, {"split": "2023_09_23T18_26_45.770833", "path": ["results_2023-09-23T18-26-45.770833.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T18-26-45.770833.parquet"]}]}]}
2023-09-23T17:26:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of minlik/chinese-alpaca-33b-merged ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model minlik/chinese-alpaca-33b-merged on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T18:26:45.770833 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
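The summary above says "you can for instance do the following:" without showing the call. A minimal sketch of that load, assuming the details repo follows the leaderboard's usual `details_<org>__<model>` naming pattern (the exact repo id for this model is not recorded in this entry):

```python
from datasets import load_dataset

# NOTE: repo id is an assumption derived from the details_<org>__<model>
# naming pattern; it is not stated anywhere in this record.
data = load_dataset("open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged",
                    "harness_winogrande_5",
                    split="train")
```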
[ "# Dataset Card for Evaluation run of minlik/chinese-alpaca-33b-merged", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model minlik/chinese-alpaca-33b-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T18:26:45.770833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of minlik/chinese-alpaca-33b-merged", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model minlik/chinese-alpaca-33b-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T18:26:45.770833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of minlik/chinese-alpaca-33b-merged## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model minlik/chinese-alpaca-33b-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T18:26:45.770833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2a1858c7bd7f034c925e714967e2cc8a202d2e70
# Dataset Card for Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [bertin-project/bertin-gpt-j-6B-alpaca](https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-22T17:02:02.199354](https://huggingface.co/datasets/open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca/blob/main/results_2023-09-22T17-02-02.199354.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.016568791946308725, "em_stderr": 0.0013072452323527502, "f1": 0.07589660234899354, "f1_stderr": 0.0018842940437008274, "acc": 0.27900552486187846, "acc_stderr": 0.006978792039554494 }, "harness|drop|3": { "em": 0.016568791946308725, "em_stderr": 0.0013072452323527502, "f1": 0.07589660234899354, "f1_stderr": 0.0018842940437008274 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.5580110497237569, "acc_stderr": 0.013957584079108989 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
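Because every run is stored both under `latest` and under its own timestamped split, a specific run can also be pinned directly. A short sketch using split names that this repo's configs declare (see the metadata below):

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca"

# "latest" always resolves to the most recent run of this task...
latest = load_dataset(repo, "harness_winogrande_5", split="latest")

# ...while a specific run can be pinned via its timestamped split name,
# exactly as listed in this repo's configs.
run = load_dataset(repo, "harness_winogrande_5", split="2023_09_22T17_02_02.199354")
```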
open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca
[ "region:us" ]
2023-08-18T17:58:24+00:00
{"pretty_name": "Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [bertin-project/bertin-gpt-j-6B-alpaca](https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T17:02:02.199354](https://huggingface.co/datasets/open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca/blob/main/results_2023-09-22T17-02-02.199354.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016568791946308725,\n \"em_stderr\": 0.0013072452323527502,\n \"f1\": 0.07589660234899354,\n \"f1_stderr\": 0.0018842940437008274,\n \"acc\": 0.27900552486187846,\n \"acc_stderr\": 0.006978792039554494\n },\n \"harness|drop|3\": {\n \"em\": 0.016568791946308725,\n \"em_stderr\": 0.0013072452323527502,\n \"f1\": 0.07589660234899354,\n \"f1_stderr\": 0.0018842940437008274\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5580110497237569,\n \"acc_stderr\": 0.013957584079108989\n }\n}\n```", "repo_url": "https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|arc:challenge|25_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T17_02_02.199354", "path": ["**/details_harness|drop|3_2023-09-22T17-02-02.199354.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T17-02-02.199354.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T17_02_02.199354", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-02-02.199354.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-02-02.199354.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hellaswag|10_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T15:41:33.782681.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T15:41:33.782681.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T17_02_02.199354", "path": ["**/details_harness|winogrande|5_2023-09-22T17-02-02.199354.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T17-02-02.199354.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T15_41_33.782681", "path": ["results_2023-08-17T15:41:33.782681.parquet"]}, {"split": "2023_09_22T17_02_02.199354", "path": ["results_2023-09-22T17-02-02.199354.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T17-02-02.199354.parquet"]}]}]}
2023-09-22T16:02:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model bertin-project/bertin-gpt-j-6B-alpaca on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T17:02:02.199354 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
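The summary above again refers to a load example without showing it; a minimal sketch, using the repo id recorded in this entry:

```python
from datasets import load_dataset

# Repo id as recorded at the top of this entry; same call as in the full card above.
data = load_dataset("open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca",
                    "harness_winogrande_5",
                    split="train")
```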
[ "# Dataset Card for Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bertin-project/bertin-gpt-j-6B-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T17:02:02.199354(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bertin-project/bertin-gpt-j-6B-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T17:02:02.199354(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bertin-project/bertin-gpt-j-6B-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T17:02:02.199354(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f544df86af2bb2a01af0b6c5b20f73d0ee50a79e
# Dataset Card for "Thunderbird_BERT_Baseline" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/Thunderbird_BERT_Baseline
[ "region:us" ]
2023-08-18T18:06:26+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 38525577.5, "num_examples": 12500}], "download_size": 0, "dataset_size": 154102307.1875}}
2023-08-23T02:39:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_BERT_Baseline" More Information needed
[ "# Dataset Card for \"Thunderbird_BERT_Baseline\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_BERT_Baseline\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_BERT_Baseline\"\n\nMore Information needed" ]
44b7d8e735bd3506713bea5e54c88c94ef4a511e
# Dataset Card for "Thunderbird_RoBERTa_Baseline" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/Thunderbird_RoBERTa_Baseline
[ "region:us" ]
2023-08-18T18:13:32+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 38525577.5, "num_examples": 12500}], "download_size": 0, "dataset_size": 154102307.1875}}
2023-08-23T02:40:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_RoBERTa_Baseline" More Information needed
[ "# Dataset Card for \"Thunderbird_RoBERTa_Baseline\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_RoBERTa_Baseline\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_RoBERTa_Baseline\"\n\nMore Information needed" ]
67d46101171f338aa0f24981d9ac5391d5921acf
# Dataset Card for "Thunderbird_DistilRoBERTa_Baseline" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/Thunderbird_DistilRoBERTa_Baseline
[ "region:us" ]
2023-08-18T18:20:05+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 38525577.5, "num_examples": 12500}], "download_size": 0, "dataset_size": 154102307.1875}}
2023-08-23T02:42:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_DistilRoBERTa_Baseline" More Information needed
[ "# Dataset Card for \"Thunderbird_DistilRoBERTa_Baseline\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_DistilRoBERTa_Baseline\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_DistilRoBERTa_Baseline\"\n\nMore Information needed" ]
e45c5645d7d28dfe5e423def7f15d54ffdd457ff
# Dataset Card for "Thunderbird_GPT2_Baseline" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/Thunderbird_GPT2_Baseline
[ "region:us" ]
2023-08-18T18:27:25+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 38525577.5, "num_examples": 12500}], "download_size": 0, "dataset_size": 154102307.1875}}
2023-08-23T02:44:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_GPT2_Baseline" More Information needed
[ "# Dataset Card for \"Thunderbird_GPT2_Baseline\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_GPT2_Baseline\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_GPT2_Baseline\"\n\nMore Information needed" ]
e14204347b7cf9c376bf1faf6d4590a63b08cbd0
# "Dharma-1" A new carefully curated benchmark set, designed for a new era where the true end user uses LLM's for zero-shot and one-shot tasks, for a vast majority of the time. Stop training your models on mindless targets (eval_loss, train_loss), start training your LLM on lightweight Dharma as an eval target. A mix of all the top benchmarks. Formed to have an equal distribution of some of the most trusted benchmarks used by those developing SOTA LLMs, comprised of only 3,000 examples for the largest size, as well as 450 and 90 for Dharma-mini and Dharma-micro respectively. The current version of Dharma is comprised of a curated sampling of the following benchmarks: - AGIEval - Bigbench - MMLU - Winogrande - Arc-C - Arc- E - OBQA - TruthfulQA - Bool-q Each of these original benchmark datasets have their own subsections, careful work has gone into also choosing an equal distribution of the important subsections of each these, to have the best representation of the original benchmark creators goals. Dharma-1 is now integrated into Axolotl as well!, so you can focus on optimizing the other aspects of your training pipeline, model architecture and/or dataset, as opposed to worrying about what is the best evaluation measurement or optimization target that will best represent capabilities for the end user. Benchmarking for top base model will be listed here when completed and verified. Special thanks to @LDJnr for their contributions. Check out their Puffin dataset here: https://huggingface.co/LDJnr [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
pharaouk/dharma-1
[ "region:us" ]
2023-08-18T18:34:12+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dharma_1_full", "path": "dharma_1_full*"}, {"split": "dharma_1_mini", "path": "dharma_1_mini*"}, {"split": "dharma_1_micro", "path": "dharma_1_micro*"}, {"split": "dharma_1_unshuffled", "path": "dharma_eval_unshuffled*"}]}]}
2023-09-14T22:50:58+00:00
[]
[]
TAGS #region-us
# "Dharma-1" A new carefully curated benchmark set, designed for a new era where the true end user uses LLM's for zero-shot and one-shot tasks, for a vast majority of the time. Stop training your models on mindless targets (eval_loss, train_loss), start training your LLM on lightweight Dharma as an eval target. A mix of all the top benchmarks. Formed to have an equal distribution of some of the most trusted benchmarks used by those developing SOTA LLMs, comprised of only 3,000 examples for the largest size, as well as 450 and 90 for Dharma-mini and Dharma-micro respectively. The current version of Dharma is comprised of a curated sampling of the following benchmarks: - AGIEval - Bigbench - MMLU - Winogrande - Arc-C - Arc- E - OBQA - TruthfulQA - Bool-q Each of these original benchmark datasets have their own subsections, careful work has gone into also choosing an equal distribution of the important subsections of each these, to have the best representation of the original benchmark creators goals. Dharma-1 is now integrated into Axolotl as well!, so you can focus on optimizing the other aspects of your training pipeline, model architecture and/or dataset, as opposed to worrying about what is the best evaluation measurement or optimization target that will best represent capabilities for the end user. Benchmarking for top base model will be listed here when completed and verified. Special thanks to @LDJnr for their contributions. Check out their Puffin dataset here: URL More Information needed
[ "# \"Dharma-1\"\nA new carefully curated benchmark set, designed for a new era where the true end user uses LLM's for zero-shot and one-shot tasks, for a vast majority of the time.\nStop training your models on mindless targets (eval_loss, train_loss), start training your LLM on lightweight Dharma as an eval target. \nA mix of all the top benchmarks.\n\nFormed to have an equal distribution of some of the most trusted benchmarks used by those developing SOTA LLMs, comprised of only 3,000 examples for the largest size, as well as 450 and 90 for Dharma-mini and Dharma-micro respectively.\n\nThe current version of Dharma is comprised of a curated sampling of the following benchmarks:\n\n - AGIEval \n - Bigbench \n - MMLU \n - Winogrande \n - Arc-C \n - Arc- E \n - OBQA \n - TruthfulQA \n - Bool-q \n\nEach of these original benchmark datasets have their own subsections, careful work has gone into also choosing an equal distribution of the important subsections of each these, to have the best representation of the original benchmark creators goals.\n\nDharma-1 is now integrated into Axolotl as well!, so you can focus on optimizing the other aspects of your training pipeline, model architecture and/or dataset, as opposed to worrying about what is the best evaluation measurement or optimization target that will best represent capabilities for the end user.\n\nBenchmarking for top base model will be listed here when completed and verified.\n\nSpecial thanks to @LDJnr for their contributions. Check out their Puffin dataset here: URL\n\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# \"Dharma-1\"\nA new carefully curated benchmark set, designed for a new era where the true end user uses LLM's for zero-shot and one-shot tasks, for a vast majority of the time.\nStop training your models on mindless targets (eval_loss, train_loss), start training your LLM on lightweight Dharma as an eval target. \nA mix of all the top benchmarks.\n\nFormed to have an equal distribution of some of the most trusted benchmarks used by those developing SOTA LLMs, comprised of only 3,000 examples for the largest size, as well as 450 and 90 for Dharma-mini and Dharma-micro respectively.\n\nThe current version of Dharma is comprised of a curated sampling of the following benchmarks:\n\n - AGIEval \n - Bigbench \n - MMLU \n - Winogrande \n - Arc-C \n - Arc- E \n - OBQA \n - TruthfulQA \n - Bool-q \n\nEach of these original benchmark datasets have their own subsections, careful work has gone into also choosing an equal distribution of the important subsections of each these, to have the best representation of the original benchmark creators goals.\n\nDharma-1 is now integrated into Axolotl as well!, so you can focus on optimizing the other aspects of your training pipeline, model architecture and/or dataset, as opposed to worrying about what is the best evaluation measurement or optimization target that will best represent capabilities for the end user.\n\nBenchmarking for top base model will be listed here when completed and verified.\n\nSpecial thanks to @LDJnr for their contributions. Check out their Puffin dataset here: URL\n\n\nMore Information needed" ]
[ 6, 371 ]
[ "passage: TAGS\n#region-us \n# \"Dharma-1\"\nA new carefully curated benchmark set, designed for a new era where the true end user uses LLM's for zero-shot and one-shot tasks, for a vast majority of the time.\nStop training your models on mindless targets (eval_loss, train_loss), start training your LLM on lightweight Dharma as an eval target. \nA mix of all the top benchmarks.\n\nFormed to have an equal distribution of some of the most trusted benchmarks used by those developing SOTA LLMs, comprised of only 3,000 examples for the largest size, as well as 450 and 90 for Dharma-mini and Dharma-micro respectively.\n\nThe current version of Dharma is comprised of a curated sampling of the following benchmarks:\n\n - AGIEval \n - Bigbench \n - MMLU \n - Winogrande \n - Arc-C \n - Arc- E \n - OBQA \n - TruthfulQA \n - Bool-q \n\nEach of these original benchmark datasets have their own subsections, careful work has gone into also choosing an equal distribution of the important subsections of each these, to have the best representation of the original benchmark creators goals.\n\nDharma-1 is now integrated into Axolotl as well!, so you can focus on optimizing the other aspects of your training pipeline, model architecture and/or dataset, as opposed to worrying about what is the best evaluation measurement or optimization target that will best represent capabilities for the end user.\n\nBenchmarking for top base model will be listed here when completed and verified.\n\nSpecial thanks to @LDJnr for their contributions. Check out their Puffin dataset here: URL\n\n\nMore Information needed" ]
7b8527cd5f3ea1badcf934a705eecb982cf375aa
# Dataset of kana_anaberal/カナ・アナベラル (Touhou) This is the dataset of kana_anaberal/カナ・アナベラル (Touhou), containing 196 images and their tags. The core tags of this character are `blonde_hair, hat, ribbon, short_hair, yellow_eyes, bow, sun_hat, white_headwear, red_ribbon, hat_bow, hat_ribbon`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 196 | 206.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kana_anaberal_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 196 | 135.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kana_anaberal_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 420 | 263.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kana_anaberal_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 196 | 190.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kana_anaberal_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 420 | 344.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kana_anaberal_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/kana_anaberal_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:---|:---|:---|:---|:---|:---|
| 0 | 29 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, dress, solo, apron, elbow_gloves, smile, white_gloves, open_mouth |
| 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blue_dress, elbow_gloves, frilled_apron, puffy_short_sleeves, white_apron, white_gloves, solo, looking_at_viewer, waist_apron, smile, red_bow, simple_background, back_bow, bangs, closed_mouth, frilled_dress, road_sign, white_background, holding, open_mouth |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, back_bow, blue_dress, elbow_gloves, frilled_apron, frilled_dress, puffy_short_sleeves, solo, white_apron, white_gloves, red_bowtie, waist_apron, white_bow, blush, hand_on_headwear, open_mouth, blue_eyes, closed_mouth, frilled_gloves, looking_at_viewer, medium_hair |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | solo | apron | elbow_gloves | smile | white_gloves | open_mouth | blue_dress | frilled_apron | puffy_short_sleeves | white_apron | looking_at_viewer | waist_apron | red_bow | simple_background | back_bow | bangs | closed_mouth | frilled_dress | road_sign | white_background | holding | red_bowtie | white_bow | blush | hand_on_headwear | blue_eyes | frilled_gloves | medium_hair |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 29 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | X | | X | X | X | X | X | X | X | X | | | X | | X | X | | | | X | X | X | X | X | X | X |
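Beyond the waifuc loader above, the IMG+TXT packages can be consumed directly. A rough sketch, assuming the conventional layout in which each image is paired with a same-stem `.txt` file of tags (the pairing convention is an assumption, not documented here):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download and unpack the 800px IMG+TXT package.
zip_file = hf_hub_download(
    repo_id='CyberHarem/kana_anaberal_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Assumed layout: every image sits next to a same-stem .txt tag file.
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```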
CyberHarem/kana_anaberal_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T18:34:44+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T03:10:00+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of kana\_anaberal/カナ・アナベラル (Touhou) =========================================== This is the dataset of kana\_anaberal/カナ・アナベラル (Touhou), containing 196 images and their tags. The core tags of this character are 'blonde\_hair, hat, ribbon, short\_hair, yellow\_eyes, bow, sun\_hat, white\_headwear, red\_ribbon, hat\_bow, hat\_ribbon', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
6ddd18249934fce80429600c8bc8fffe5b50ff23
# Dataset of matara_okina (Touhou) This is the dataset of matara_okina (Touhou), containing 500 images and their tags. The core tags of this character are `blonde_hair, hat, long_hair, black_headwear, bangs, yellow_eyes, brown_headwear, hair_between_eyes, breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 630.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 368.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1134 | 751.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 558.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1134 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/matara_okina_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, detached_sleeves, green_skirt, long_sleeves, orange_cape, simple_background, solo, white_shirt, wide_sleeves, constellation_print, looking_at_viewer, smile, tabard, eyes_visible_through_hair, white_background, medium_breasts, hand_up, closed_mouth, hands_up, orange_sleeves, sun_symbol, blush, drum, sitting, standing, boots, open_mouth | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, detached_sleeves, long_sleeves, medium_breasts, orange_cape, orange_sleeves, simple_background, solo, tabard, upper_body, white_shirt, wide_sleeves, constellation_print, looking_at_viewer, smile, white_background, blush, hand_up, open_mouth | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, constellation_print, detached_sleeves, green_skirt, long_sleeves, looking_at_viewer, solo, tabard, white_shirt, wide_sleeves, smile, open_mouth, orange_cape, orange_sleeves | | 3 | 12 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, closed_mouth, green_skirt, long_sleeves, looking_at_viewer, sitting, smile, solo, tabard, wide_sleeves, constellation_print, detached_sleeves, white_shirt, chair, drum, orange_sleeves, boots, black_footwear | | 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, boots, closed_mouth, detached_sleeves, full_body, green_skirt, long_sleeves, looking_at_viewer, solo, tabard, wide_sleeves, black_footwear, constellation_print, smile, aura, simple_background, standing, white_background, white_shirt | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | green_skirt | long_sleeves | orange_cape | simple_background | solo | white_shirt | wide_sleeves | constellation_print | looking_at_viewer | smile | tabard | eyes_visible_through_hair | white_background | medium_breasts | hand_up | closed_mouth | hands_up | orange_sleeves | sun_symbol | blush | drum | sitting | standing | boots | open_mouth | upper_body | chair | black_footwear | full_body | aura | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------------|:---------------|:--------------|:--------------------|:-------|:--------------|:---------------|:----------------------|:--------------------|:--------|:---------|:----------------------------|:-------------------|:-----------------|:----------|:---------------|:-----------|:-----------------|:-------------|:--------|:-------|:----------|:-----------|:--------|:-------------|:-------------|:--------|:-----------------|:------------|:-------| | 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | X | X | X | X | X | X | X | | X | X | X | | | X | | X | | | | | X | X | | | | | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | X | | | | | | | X | | | | | | | 3 | 12 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | | | X | X | X | X | X | X | X | | | | | X | | X | | | X | X | | X | | | X | X | | | | 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | | X | X | X | X | X | X | X | X | | X | | | X | | | | | | | X | X | | | | X | X | X |
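Since each loaded item exposes its tags through `item.meta['tags']` (as in the loader above), the clusters can be mined by filtering on their defining tags. A small sketch under the assumption that the tag collection iterates as plain strings:

```python
from waifuc.source import LocalSource

# Tags that characterize cluster 0 above (the orange-cape outfit).
wanted = {'orange_cape', 'constellation_print', 'tabard'}

# 'dataset_dir' is the directory extracted by the loader shown earlier.
source = LocalSource('dataset_dir')
for item in source:
    tags = set(item.meta['tags'])  # assumption: iterating yields tag strings
    if wanted <= tags:
        print(item.meta['filename'])
```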
CyberHarem/matara_okina_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T18:37:37+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T07:16:20+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of matara\_okina (Touhou) ================================= This is the dataset of matara\_okina (Touhou), containing 500 images and their tags. The core tags of this character are 'blonde\_hair, hat, long\_hair, black\_headwear, bangs, yellow\_eyes, brown\_headwear, hair\_between\_eyes, breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
de0d71babf90a362f24bd2916ace22d5ec7ed5f9
# Dataset of kurokoma_saki (Touhou) This is the dataset of kurokoma_saki (Touhou), containing 42 images and their tags. The core tags of this character are `wings, black_hair, red_eyes, hat, black_wings, cowboy_hat, long_hair, bangs, brown_headwear, feathered_wings, breasts, hair_between_eyes, ponytail, tail, medium_breasts, horse_tail`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 42 | 58.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 42 | 35.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 102 | 71.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 42 | 51.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 102 | 96.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/kurokoma_saki_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:---|:---|:---|:---|:---|:---|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bandana, bare_shoulders, blue_shirt, brown_skirt, looking_at_viewer, off-shoulder_shirt, overskirt, solo, cleavage, cowboy_shot, hand_up, miniskirt, puffy_short_sleeves, standing, feathers, thighs, :d, large_breasts, open_mouth, plaid_skirt, pleated_skirt, short_hair, very_long_hair |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, brown_footwear, brown_skirt, looking_at_viewer, overskirt, smile, solo, blue_shirt, puffy_short_sleeves, plaid, simple_background, white_background, bandana, closed_mouth, full_body, hand_on_headwear, hand_up, off-shoulder_shirt, bare_shoulders, knee_boots, multicolored_clothes, scarf |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bandana, blue_shirt, looking_at_viewer, solo, upper_body, bare_shoulders, off-shoulder_shirt, puffy_short_sleeves, simple_background, blush, grin, cleavage, white_background |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bandana | bare_shoulders | blue_shirt | brown_skirt | looking_at_viewer | off-shoulder_shirt | overskirt | solo | cleavage | cowboy_shot | hand_up | miniskirt | puffy_short_sleeves | standing | feathers | thighs | :d | large_breasts | open_mouth | plaid_skirt | pleated_skirt | short_hair | very_long_hair | brown_footwear | smile | plaid | simple_background | white_background | closed_mouth | full_body | hand_on_headwear | knee_boots | multicolored_clothes | scarf | upper_body | blush | grin |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | | | X | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | X | X | | X | X | | | | X | | | | | | | | | | | | | | X | X | | | | | | | X | X | X |
CyberHarem/kurokoma_saki_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T18:42:30+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T06:34:17+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of kurokoma\_saki (Touhou) ================================== This is the dataset of kurokoma\_saki (Touhou), containing 42 images and their tags. The core tags of this character are 'wings, black\_hair, red\_eyes, hat, black\_wings, cowboy\_hat, long\_hair, bangs, brown\_headwear, feathered\_wings, breasts, hair\_between\_eyes, ponytail, tail, medium\_breasts, horse\_tail', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
5a6c26c5fae39d0d16f7828d98d72175f8e590e1
# Dataset Card for "Thunderbird_GPTNEO_Baseline" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
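Although the card itself is a stub, the metadata below declares a default config with train and test splits whose columns are numbered float32 features. A minimal loading sketch based on that declared schema:

```python
from datasets import load_dataset

# Splits ("train", "test") come from the dataset's declared config.
ds = load_dataset("EgilKarlsen/Thunderbird_GPTNEO_Baseline")

train = ds["train"]
# Columns are numbered strings ("0", "1", ...), each a float32 feature.
print(len(train.column_names), "columns;", len(train), "rows")
print(train[0]["0"])
```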
EgilKarlsen/Thunderbird_GPTNEO_Baseline
[ "region:us" ]
2023-08-18T18:48:47+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "768", "dtype": "float32"}, {"name": "769", "dtype": "float32"}, {"name": "770", "dtype": "float32"}, {"name": "771", "dtype": "float32"}, {"name": "772", "dtype": "float32"}, {"name": "773", "dtype": "float32"}, {"name": "774", "dtype": "float32"}, {"name": "775", "dtype": "float32"}, {"name": "776", "dtype": "float32"}, {"name": "777", "dtype": "float32"}, {"name": "778", "dtype": "float32"}, {"name": "779", "dtype": "float32"}, {"name": "780", "dtype": "float32"}, {"name": "781", "dtype": "float32"}, {"name": "782", "dtype": "float32"}, {"name": "783", "dtype": "float32"}, {"name": "784", "dtype": "float32"}, {"name": "785", "dtype": "float32"}, {"name": "786", "dtype": "float32"}, {"name": "787", "dtype": "float32"}, {"name": "788", "dtype": "float32"}, {"name": "789", "dtype": "float32"}, {"name": "790", "dtype": "float32"}, {"name": "791", "dtype": "float32"}, {"name": "792", "dtype": "float32"}, {"name": "793", "dtype": "float32"}, {"name": "794", "dtype": "float32"}, {"name": "795", "dtype": "float32"}, {"name": "796", "dtype": "float32"}, {"name": "797", "dtype": "float32"}, {"name": "798", "dtype": "float32"}, {"name": "799", "dtype": "float32"}, {"name": "800", "dtype": "float32"}, {"name": "801", "dtype": "float32"}, {"name": "802", "dtype": "float32"}, {"name": "803", "dtype": "float32"}, {"name": "804", "dtype": "float32"}, {"name": "805", "dtype": "float32"}, {"name": "806", "dtype": "float32"}, {"name": "807", "dtype": "float32"}, {"name": "808", "dtype": "float32"}, {"name": "809", "dtype": "float32"}, {"name": "810", "dtype": "float32"}, {"name": "811", "dtype": "float32"}, {"name": "812", "dtype": "float32"}, {"name": "813", "dtype": "float32"}, {"name": "814", "dtype": "float32"}, {"name": "815", "dtype": "float32"}, {"name": "816", "dtype": "float32"}, {"name": "817", "dtype": "float32"}, {"name": "818", "dtype": "float32"}, {"name": "819", "dtype": "float32"}, {"name": "820", "dtype": "float32"}, {"name": "821", "dtype": "float32"}, {"name": "822", "dtype": "float32"}, {"name": "823", "dtype": "float32"}, {"name": "824", "dtype": "float32"}, {"name": "825", "dtype": "float32"}, {"name": "826", "dtype": "float32"}, {"name": "827", "dtype": "float32"}, {"name": "828", "dtype": "float32"}, {"name": "829", "dtype": "float32"}, {"name": "830", "dtype": "float32"}, {"name": "831", "dtype": "float32"}, {"name": "832", "dtype": "float32"}, {"name": "833", "dtype": "float32"}, {"name": "834", "dtype": "float32"}, {"name": "835", "dtype": "float32"}, {"name": "836", "dtype": "float32"}, {"name": "837", "dtype": "float32"}, {"name": "838", "dtype": "float32"}, {"name": "839", "dtype": "float32"}, {"name": "840", "dtype": "float32"}, {"name": "841", "dtype": "float32"}, {"name": "842", "dtype": "float32"}, {"name": "843", "dtype": "float32"}, {"name": "844", "dtype": "float32"}, {"name": "845", "dtype": "float32"}, {"name": "846", "dtype": "float32"}, {"name": "847", "dtype": "float32"}, {"name": "848", "dtype": "float32"}, {"name": "849", "dtype": "float32"}, {"name": "850", "dtype": "float32"}, {"name": "851", "dtype": "float32"}, {"name": "852", "dtype": "float32"}, {"name": "853", "dtype": "float32"}, {"name": "854", "dtype": "float32"}, {"name": "855", "dtype": "float32"}, {"name": "856", "dtype": "float32"}, {"name": "857", "dtype": "float32"}, {"name": "858", "dtype": "float32"}, {"name": "859", "dtype": "float32"}, {"name": "860", "dtype": "float32"}, {"name": "861", "dtype": "float32"}, {"name": 
"862", "dtype": "float32"}, {"name": "863", "dtype": "float32"}, {"name": "864", "dtype": "float32"}, {"name": "865", "dtype": "float32"}, {"name": "866", "dtype": "float32"}, {"name": "867", "dtype": "float32"}, {"name": "868", "dtype": "float32"}, {"name": "869", "dtype": "float32"}, {"name": "870", "dtype": "float32"}, {"name": "871", "dtype": "float32"}, {"name": "872", "dtype": "float32"}, {"name": "873", "dtype": "float32"}, {"name": "874", "dtype": "float32"}, {"name": "875", "dtype": "float32"}, {"name": "876", "dtype": "float32"}, {"name": "877", "dtype": "float32"}, {"name": "878", "dtype": "float32"}, {"name": "879", "dtype": "float32"}, {"name": "880", "dtype": "float32"}, {"name": "881", "dtype": "float32"}, {"name": "882", "dtype": "float32"}, {"name": "883", "dtype": "float32"}, {"name": "884", "dtype": "float32"}, {"name": "885", "dtype": "float32"}, {"name": "886", "dtype": "float32"}, {"name": "887", "dtype": "float32"}, {"name": "888", "dtype": "float32"}, {"name": "889", "dtype": "float32"}, {"name": "890", "dtype": "float32"}, {"name": "891", "dtype": "float32"}, {"name": "892", "dtype": "float32"}, {"name": "893", "dtype": "float32"}, {"name": "894", "dtype": "float32"}, {"name": "895", "dtype": "float32"}, {"name": "896", "dtype": "float32"}, {"name": "897", "dtype": "float32"}, {"name": "898", "dtype": "float32"}, {"name": "899", "dtype": "float32"}, {"name": "900", "dtype": "float32"}, {"name": "901", "dtype": "float32"}, {"name": "902", "dtype": "float32"}, {"name": "903", "dtype": "float32"}, {"name": "904", "dtype": "float32"}, {"name": "905", "dtype": "float32"}, {"name": "906", "dtype": "float32"}, {"name": "907", "dtype": "float32"}, {"name": "908", "dtype": "float32"}, {"name": "909", "dtype": "float32"}, {"name": "910", "dtype": "float32"}, {"name": "911", "dtype": "float32"}, {"name": "912", "dtype": "float32"}, {"name": "913", "dtype": "float32"}, {"name": "914", "dtype": "float32"}, {"name": "915", "dtype": "float32"}, {"name": "916", "dtype": "float32"}, {"name": "917", "dtype": "float32"}, {"name": "918", "dtype": "float32"}, {"name": "919", "dtype": "float32"}, {"name": "920", "dtype": "float32"}, {"name": "921", "dtype": "float32"}, {"name": "922", "dtype": "float32"}, {"name": "923", "dtype": "float32"}, {"name": "924", "dtype": "float32"}, {"name": "925", "dtype": "float32"}, {"name": "926", "dtype": "float32"}, {"name": "927", "dtype": "float32"}, {"name": "928", "dtype": "float32"}, {"name": "929", "dtype": "float32"}, {"name": "930", "dtype": "float32"}, {"name": "931", "dtype": "float32"}, {"name": "932", "dtype": "float32"}, {"name": "933", "dtype": "float32"}, {"name": "934", "dtype": "float32"}, {"name": "935", "dtype": "float32"}, {"name": "936", "dtype": "float32"}, {"name": "937", "dtype": "float32"}, {"name": "938", "dtype": "float32"}, {"name": "939", "dtype": "float32"}, {"name": "940", "dtype": "float32"}, {"name": "941", "dtype": "float32"}, {"name": "942", "dtype": "float32"}, {"name": "943", "dtype": "float32"}, {"name": "944", "dtype": "float32"}, {"name": "945", "dtype": "float32"}, {"name": "946", "dtype": "float32"}, {"name": "947", "dtype": "float32"}, {"name": "948", "dtype": "float32"}, {"name": "949", "dtype": "float32"}, {"name": "950", "dtype": "float32"}, {"name": "951", "dtype": "float32"}, {"name": "952", "dtype": "float32"}, {"name": "953", "dtype": "float32"}, {"name": "954", "dtype": "float32"}, {"name": "955", "dtype": "float32"}, {"name": "956", "dtype": "float32"}, {"name": "957", "dtype": "float32"}, {"name": 
"958", "dtype": "float32"}, {"name": "959", "dtype": "float32"}, {"name": "960", "dtype": "float32"}, {"name": "961", "dtype": "float32"}, {"name": "962", "dtype": "float32"}, {"name": "963", "dtype": "float32"}, {"name": "964", "dtype": "float32"}, {"name": "965", "dtype": "float32"}, {"name": "966", "dtype": "float32"}, {"name": "967", "dtype": "float32"}, {"name": "968", "dtype": "float32"}, {"name": "969", "dtype": "float32"}, {"name": "970", "dtype": "float32"}, {"name": "971", "dtype": "float32"}, {"name": "972", "dtype": "float32"}, {"name": "973", "dtype": "float32"}, {"name": "974", "dtype": "float32"}, {"name": "975", "dtype": "float32"}, {"name": "976", "dtype": "float32"}, {"name": "977", "dtype": "float32"}, {"name": "978", "dtype": "float32"}, {"name": "979", "dtype": "float32"}, {"name": "980", "dtype": "float32"}, {"name": "981", "dtype": "float32"}, {"name": "982", "dtype": "float32"}, {"name": "983", "dtype": "float32"}, {"name": "984", "dtype": "float32"}, {"name": "985", "dtype": "float32"}, {"name": "986", "dtype": "float32"}, {"name": "987", "dtype": "float32"}, {"name": "988", "dtype": "float32"}, {"name": "989", "dtype": "float32"}, {"name": "990", "dtype": "float32"}, {"name": "991", "dtype": "float32"}, {"name": "992", "dtype": "float32"}, {"name": "993", "dtype": "float32"}, {"name": "994", "dtype": "float32"}, {"name": "995", "dtype": "float32"}, {"name": "996", "dtype": "float32"}, {"name": "997", "dtype": "float32"}, {"name": "998", "dtype": "float32"}, {"name": "999", "dtype": "float32"}, {"name": "1000", "dtype": "float32"}, {"name": "1001", "dtype": "float32"}, {"name": "1002", "dtype": "float32"}, {"name": "1003", "dtype": "float32"}, {"name": "1004", "dtype": "float32"}, {"name": "1005", "dtype": "float32"}, {"name": "1006", "dtype": "float32"}, {"name": "1007", "dtype": "float32"}, {"name": "1008", "dtype": "float32"}, {"name": "1009", "dtype": "float32"}, {"name": "1010", "dtype": "float32"}, {"name": "1011", "dtype": "float32"}, {"name": "1012", "dtype": "float32"}, {"name": "1013", "dtype": "float32"}, {"name": "1014", "dtype": "float32"}, {"name": "1015", "dtype": "float32"}, {"name": "1016", "dtype": "float32"}, {"name": "1017", "dtype": "float32"}, {"name": "1018", "dtype": "float32"}, {"name": "1019", "dtype": "float32"}, {"name": "1020", "dtype": "float32"}, {"name": "1021", "dtype": "float32"}, {"name": "1022", "dtype": "float32"}, {"name": "1023", "dtype": "float32"}, {"name": "1024", "dtype": "float32"}, {"name": "1025", "dtype": "float32"}, {"name": "1026", "dtype": "float32"}, {"name": "1027", "dtype": "float32"}, {"name": "1028", "dtype": "float32"}, {"name": "1029", "dtype": "float32"}, {"name": "1030", "dtype": "float32"}, {"name": "1031", "dtype": "float32"}, {"name": "1032", "dtype": "float32"}, {"name": "1033", "dtype": "float32"}, {"name": "1034", "dtype": "float32"}, {"name": "1035", "dtype": "float32"}, {"name": "1036", "dtype": "float32"}, {"name": "1037", "dtype": "float32"}, {"name": "1038", "dtype": "float32"}, {"name": "1039", "dtype": "float32"}, {"name": "1040", "dtype": "float32"}, {"name": "1041", "dtype": "float32"}, {"name": "1042", "dtype": "float32"}, {"name": "1043", "dtype": "float32"}, {"name": "1044", "dtype": "float32"}, {"name": "1045", "dtype": "float32"}, {"name": "1046", "dtype": "float32"}, {"name": "1047", "dtype": "float32"}, {"name": "1048", "dtype": "float32"}, {"name": "1049", "dtype": "float32"}, {"name": "1050", "dtype": "float32"}, {"name": "1051", "dtype": "float32"}, {"name": "1052", "dtype": 
"float32"}, {"name": "1053", "dtype": "float32"}, {"name": "1054", "dtype": "float32"}, {"name": "1055", "dtype": "float32"}, {"name": "1056", "dtype": "float32"}, {"name": "1057", "dtype": "float32"}, {"name": "1058", "dtype": "float32"}, {"name": "1059", "dtype": "float32"}, {"name": "1060", "dtype": "float32"}, {"name": "1061", "dtype": "float32"}, {"name": "1062", "dtype": "float32"}, {"name": "1063", "dtype": "float32"}, {"name": "1064", "dtype": "float32"}, {"name": "1065", "dtype": "float32"}, {"name": "1066", "dtype": "float32"}, {"name": "1067", "dtype": "float32"}, {"name": "1068", "dtype": "float32"}, {"name": "1069", "dtype": "float32"}, {"name": "1070", "dtype": "float32"}, {"name": "1071", "dtype": "float32"}, {"name": "1072", "dtype": "float32"}, {"name": "1073", "dtype": "float32"}, {"name": "1074", "dtype": "float32"}, {"name": "1075", "dtype": "float32"}, {"name": "1076", "dtype": "float32"}, {"name": "1077", "dtype": "float32"}, {"name": "1078", "dtype": "float32"}, {"name": "1079", "dtype": "float32"}, {"name": "1080", "dtype": "float32"}, {"name": "1081", "dtype": "float32"}, {"name": "1082", "dtype": "float32"}, {"name": "1083", "dtype": "float32"}, {"name": "1084", "dtype": "float32"}, {"name": "1085", "dtype": "float32"}, {"name": "1086", "dtype": "float32"}, {"name": "1087", "dtype": "float32"}, {"name": "1088", "dtype": "float32"}, {"name": "1089", "dtype": "float32"}, {"name": "1090", "dtype": "float32"}, {"name": "1091", "dtype": "float32"}, {"name": "1092", "dtype": "float32"}, {"name": "1093", "dtype": "float32"}, {"name": "1094", "dtype": "float32"}, {"name": "1095", "dtype": "float32"}, {"name": "1096", "dtype": "float32"}, {"name": "1097", "dtype": "float32"}, {"name": "1098", "dtype": "float32"}, {"name": "1099", "dtype": "float32"}, {"name": "1100", "dtype": "float32"}, {"name": "1101", "dtype": "float32"}, {"name": "1102", "dtype": "float32"}, {"name": "1103", "dtype": "float32"}, {"name": "1104", "dtype": "float32"}, {"name": "1105", "dtype": "float32"}, {"name": "1106", "dtype": "float32"}, {"name": "1107", "dtype": "float32"}, {"name": "1108", "dtype": "float32"}, {"name": "1109", "dtype": "float32"}, {"name": "1110", "dtype": "float32"}, {"name": "1111", "dtype": "float32"}, {"name": "1112", "dtype": "float32"}, {"name": "1113", "dtype": "float32"}, {"name": "1114", "dtype": "float32"}, {"name": "1115", "dtype": "float32"}, {"name": "1116", "dtype": "float32"}, {"name": "1117", "dtype": "float32"}, {"name": "1118", "dtype": "float32"}, {"name": "1119", "dtype": "float32"}, {"name": "1120", "dtype": "float32"}, {"name": "1121", "dtype": "float32"}, {"name": "1122", "dtype": "float32"}, {"name": "1123", "dtype": "float32"}, {"name": "1124", "dtype": "float32"}, {"name": "1125", "dtype": "float32"}, {"name": "1126", "dtype": "float32"}, {"name": "1127", "dtype": "float32"}, {"name": "1128", "dtype": "float32"}, {"name": "1129", "dtype": "float32"}, {"name": "1130", "dtype": "float32"}, {"name": "1131", "dtype": "float32"}, {"name": "1132", "dtype": "float32"}, {"name": "1133", "dtype": "float32"}, {"name": "1134", "dtype": "float32"}, {"name": "1135", "dtype": "float32"}, {"name": "1136", "dtype": "float32"}, {"name": "1137", "dtype": "float32"}, {"name": "1138", "dtype": "float32"}, {"name": "1139", "dtype": "float32"}, {"name": "1140", "dtype": "float32"}, {"name": "1141", "dtype": "float32"}, {"name": "1142", "dtype": "float32"}, {"name": "1143", "dtype": "float32"}, {"name": "1144", "dtype": "float32"}, {"name": "1145", "dtype": "float32"}, {"name": 
"1146", "dtype": "float32"}, {"name": "1147", "dtype": "float32"}, {"name": "1148", "dtype": "float32"}, {"name": "1149", "dtype": "float32"}, {"name": "1150", "dtype": "float32"}, {"name": "1151", "dtype": "float32"}, {"name": "1152", "dtype": "float32"}, {"name": "1153", "dtype": "float32"}, {"name": "1154", "dtype": "float32"}, {"name": "1155", "dtype": "float32"}, {"name": "1156", "dtype": "float32"}, {"name": "1157", "dtype": "float32"}, {"name": "1158", "dtype": "float32"}, {"name": "1159", "dtype": "float32"}, {"name": "1160", "dtype": "float32"}, {"name": "1161", "dtype": "float32"}, {"name": "1162", "dtype": "float32"}, {"name": "1163", "dtype": "float32"}, {"name": "1164", "dtype": "float32"}, {"name": "1165", "dtype": "float32"}, {"name": "1166", "dtype": "float32"}, {"name": "1167", "dtype": "float32"}, {"name": "1168", "dtype": "float32"}, {"name": "1169", "dtype": "float32"}, {"name": "1170", "dtype": "float32"}, {"name": "1171", "dtype": "float32"}, {"name": "1172", "dtype": "float32"}, {"name": "1173", "dtype": "float32"}, {"name": "1174", "dtype": "float32"}, {"name": "1175", "dtype": "float32"}, {"name": "1176", "dtype": "float32"}, {"name": "1177", "dtype": "float32"}, {"name": "1178", "dtype": "float32"}, {"name": "1179", "dtype": "float32"}, {"name": "1180", "dtype": "float32"}, {"name": "1181", "dtype": "float32"}, {"name": "1182", "dtype": "float32"}, {"name": "1183", "dtype": "float32"}, {"name": "1184", "dtype": "float32"}, {"name": "1185", "dtype": "float32"}, {"name": "1186", "dtype": "float32"}, {"name": "1187", "dtype": "float32"}, {"name": "1188", "dtype": "float32"}, {"name": "1189", "dtype": "float32"}, {"name": "1190", "dtype": "float32"}, {"name": "1191", "dtype": "float32"}, {"name": "1192", "dtype": "float32"}, {"name": "1193", "dtype": "float32"}, {"name": "1194", "dtype": "float32"}, {"name": "1195", "dtype": "float32"}, {"name": "1196", "dtype": "float32"}, {"name": "1197", "dtype": "float32"}, {"name": "1198", "dtype": "float32"}, {"name": "1199", "dtype": "float32"}, {"name": "1200", "dtype": "float32"}, {"name": "1201", "dtype": "float32"}, {"name": "1202", "dtype": "float32"}, {"name": "1203", "dtype": "float32"}, {"name": "1204", "dtype": "float32"}, {"name": "1205", "dtype": "float32"}, {"name": "1206", "dtype": "float32"}, {"name": "1207", "dtype": "float32"}, {"name": "1208", "dtype": "float32"}, {"name": "1209", "dtype": "float32"}, {"name": "1210", "dtype": "float32"}, {"name": "1211", "dtype": "float32"}, {"name": "1212", "dtype": "float32"}, {"name": "1213", "dtype": "float32"}, {"name": "1214", "dtype": "float32"}, {"name": "1215", "dtype": "float32"}, {"name": "1216", "dtype": "float32"}, {"name": "1217", "dtype": "float32"}, {"name": "1218", "dtype": "float32"}, {"name": "1219", "dtype": "float32"}, {"name": "1220", "dtype": "float32"}, {"name": "1221", "dtype": "float32"}, {"name": "1222", "dtype": "float32"}, {"name": "1223", "dtype": "float32"}, {"name": "1224", "dtype": "float32"}, {"name": "1225", "dtype": "float32"}, {"name": "1226", "dtype": "float32"}, {"name": "1227", "dtype": "float32"}, {"name": "1228", "dtype": "float32"}, {"name": "1229", "dtype": "float32"}, {"name": "1230", "dtype": "float32"}, {"name": "1231", "dtype": "float32"}, {"name": "1232", "dtype": "float32"}, {"name": "1233", "dtype": "float32"}, {"name": "1234", "dtype": "float32"}, {"name": "1235", "dtype": "float32"}, {"name": "1236", "dtype": "float32"}, {"name": "1237", "dtype": "float32"}, {"name": "1238", "dtype": "float32"}, {"name": "1239", "dtype": 
"float32"}, {"name": "1240", "dtype": "float32"}, {"name": "1241", "dtype": "float32"}, {"name": "1242", "dtype": "float32"}, {"name": "1243", "dtype": "float32"}, {"name": "1244", "dtype": "float32"}, {"name": "1245", "dtype": "float32"}, {"name": "1246", "dtype": "float32"}, {"name": "1247", "dtype": "float32"}, {"name": "1248", "dtype": "float32"}, {"name": "1249", "dtype": "float32"}, {"name": "1250", "dtype": "float32"}, {"name": "1251", "dtype": "float32"}, {"name": "1252", "dtype": "float32"}, {"name": "1253", "dtype": "float32"}, {"name": "1254", "dtype": "float32"}, {"name": "1255", "dtype": "float32"}, {"name": "1256", "dtype": "float32"}, {"name": "1257", "dtype": "float32"}, {"name": "1258", "dtype": "float32"}, {"name": "1259", "dtype": "float32"}, {"name": "1260", "dtype": "float32"}, {"name": "1261", "dtype": "float32"}, {"name": "1262", "dtype": "float32"}, {"name": "1263", "dtype": "float32"}, {"name": "1264", "dtype": "float32"}, {"name": "1265", "dtype": "float32"}, {"name": "1266", "dtype": "float32"}, {"name": "1267", "dtype": "float32"}, {"name": "1268", "dtype": "float32"}, {"name": "1269", "dtype": "float32"}, {"name": "1270", "dtype": "float32"}, {"name": "1271", "dtype": "float32"}, {"name": "1272", "dtype": "float32"}, {"name": "1273", "dtype": "float32"}, {"name": "1274", "dtype": "float32"}, {"name": "1275", "dtype": "float32"}, {"name": "1276", "dtype": "float32"}, {"name": "1277", "dtype": "float32"}, {"name": "1278", "dtype": "float32"}, {"name": "1279", "dtype": "float32"}, {"name": "1280", "dtype": "float32"}, {"name": "1281", "dtype": "float32"}, {"name": "1282", "dtype": "float32"}, {"name": "1283", "dtype": "float32"}, {"name": "1284", "dtype": "float32"}, {"name": "1285", "dtype": "float32"}, {"name": "1286", "dtype": "float32"}, {"name": "1287", "dtype": "float32"}, {"name": "1288", "dtype": "float32"}, {"name": "1289", "dtype": "float32"}, {"name": "1290", "dtype": "float32"}, {"name": "1291", "dtype": "float32"}, {"name": "1292", "dtype": "float32"}, {"name": "1293", "dtype": "float32"}, {"name": "1294", "dtype": "float32"}, {"name": "1295", "dtype": "float32"}, {"name": "1296", "dtype": "float32"}, {"name": "1297", "dtype": "float32"}, {"name": "1298", "dtype": "float32"}, {"name": "1299", "dtype": "float32"}, {"name": "1300", "dtype": "float32"}, {"name": "1301", "dtype": "float32"}, {"name": "1302", "dtype": "float32"}, {"name": "1303", "dtype": "float32"}, {"name": "1304", "dtype": "float32"}, {"name": "1305", "dtype": "float32"}, {"name": "1306", "dtype": "float32"}, {"name": "1307", "dtype": "float32"}, {"name": "1308", "dtype": "float32"}, {"name": "1309", "dtype": "float32"}, {"name": "1310", "dtype": "float32"}, {"name": "1311", "dtype": "float32"}, {"name": "1312", "dtype": "float32"}, {"name": "1313", "dtype": "float32"}, {"name": "1314", "dtype": "float32"}, {"name": "1315", "dtype": "float32"}, {"name": "1316", "dtype": "float32"}, {"name": "1317", "dtype": "float32"}, {"name": "1318", "dtype": "float32"}, {"name": "1319", "dtype": "float32"}, {"name": "1320", "dtype": "float32"}, {"name": "1321", "dtype": "float32"}, {"name": "1322", "dtype": "float32"}, {"name": "1323", "dtype": "float32"}, {"name": "1324", "dtype": "float32"}, {"name": "1325", "dtype": "float32"}, {"name": "1326", "dtype": "float32"}, {"name": "1327", "dtype": "float32"}, {"name": "1328", "dtype": "float32"}, {"name": "1329", "dtype": "float32"}, {"name": "1330", "dtype": "float32"}, {"name": "1331", "dtype": "float32"}, {"name": "1332", "dtype": "float32"}, {"name": 
"1333", "dtype": "float32"}, {"name": "1334", "dtype": "float32"}, {"name": "1335", "dtype": "float32"}, {"name": "1336", "dtype": "float32"}, {"name": "1337", "dtype": "float32"}, {"name": "1338", "dtype": "float32"}, {"name": "1339", "dtype": "float32"}, {"name": "1340", "dtype": "float32"}, {"name": "1341", "dtype": "float32"}, {"name": "1342", "dtype": "float32"}, {"name": "1343", "dtype": "float32"}, {"name": "1344", "dtype": "float32"}, {"name": "1345", "dtype": "float32"}, {"name": "1346", "dtype": "float32"}, {"name": "1347", "dtype": "float32"}, {"name": "1348", "dtype": "float32"}, {"name": "1349", "dtype": "float32"}, {"name": "1350", "dtype": "float32"}, {"name": "1351", "dtype": "float32"}, {"name": "1352", "dtype": "float32"}, {"name": "1353", "dtype": "float32"}, {"name": "1354", "dtype": "float32"}, {"name": "1355", "dtype": "float32"}, {"name": "1356", "dtype": "float32"}, {"name": "1357", "dtype": "float32"}, {"name": "1358", "dtype": "float32"}, {"name": "1359", "dtype": "float32"}, {"name": "1360", "dtype": "float32"}, {"name": "1361", "dtype": "float32"}, {"name": "1362", "dtype": "float32"}, {"name": "1363", "dtype": "float32"}, {"name": "1364", "dtype": "float32"}, {"name": "1365", "dtype": "float32"}, {"name": "1366", "dtype": "float32"}, {"name": "1367", "dtype": "float32"}, {"name": "1368", "dtype": "float32"}, {"name": "1369", "dtype": "float32"}, {"name": "1370", "dtype": "float32"}, {"name": "1371", "dtype": "float32"}, {"name": "1372", "dtype": "float32"}, {"name": "1373", "dtype": "float32"}, {"name": "1374", "dtype": "float32"}, {"name": "1375", "dtype": "float32"}, {"name": "1376", "dtype": "float32"}, {"name": "1377", "dtype": "float32"}, {"name": "1378", "dtype": "float32"}, {"name": "1379", "dtype": "float32"}, {"name": "1380", "dtype": "float32"}, {"name": "1381", "dtype": "float32"}, {"name": "1382", "dtype": "float32"}, {"name": "1383", "dtype": "float32"}, {"name": "1384", "dtype": "float32"}, {"name": "1385", "dtype": "float32"}, {"name": "1386", "dtype": "float32"}, {"name": "1387", "dtype": "float32"}, {"name": "1388", "dtype": "float32"}, {"name": "1389", "dtype": "float32"}, {"name": "1390", "dtype": "float32"}, {"name": "1391", "dtype": "float32"}, {"name": "1392", "dtype": "float32"}, {"name": "1393", "dtype": "float32"}, {"name": "1394", "dtype": "float32"}, {"name": "1395", "dtype": "float32"}, {"name": "1396", "dtype": "float32"}, {"name": "1397", "dtype": "float32"}, {"name": "1398", "dtype": "float32"}, {"name": "1399", "dtype": "float32"}, {"name": "1400", "dtype": "float32"}, {"name": "1401", "dtype": "float32"}, {"name": "1402", "dtype": "float32"}, {"name": "1403", "dtype": "float32"}, {"name": "1404", "dtype": "float32"}, {"name": "1405", "dtype": "float32"}, {"name": "1406", "dtype": "float32"}, {"name": "1407", "dtype": "float32"}, {"name": "1408", "dtype": "float32"}, {"name": "1409", "dtype": "float32"}, {"name": "1410", "dtype": "float32"}, {"name": "1411", "dtype": "float32"}, {"name": "1412", "dtype": "float32"}, {"name": "1413", "dtype": "float32"}, {"name": "1414", "dtype": "float32"}, {"name": "1415", "dtype": "float32"}, {"name": "1416", "dtype": "float32"}, {"name": "1417", "dtype": "float32"}, {"name": "1418", "dtype": "float32"}, {"name": "1419", "dtype": "float32"}, {"name": "1420", "dtype": "float32"}, {"name": "1421", "dtype": "float32"}, {"name": "1422", "dtype": "float32"}, {"name": "1423", "dtype": "float32"}, {"name": "1424", "dtype": "float32"}, {"name": "1425", "dtype": "float32"}, {"name": "1426", "dtype": 
"float32"}, {"name": "1427", "dtype": "float32"}, {"name": "1428", "dtype": "float32"}, {"name": "1429", "dtype": "float32"}, {"name": "1430", "dtype": "float32"}, {"name": "1431", "dtype": "float32"}, {"name": "1432", "dtype": "float32"}, {"name": "1433", "dtype": "float32"}, {"name": "1434", "dtype": "float32"}, {"name": "1435", "dtype": "float32"}, {"name": "1436", "dtype": "float32"}, {"name": "1437", "dtype": "float32"}, {"name": "1438", "dtype": "float32"}, {"name": "1439", "dtype": "float32"}, {"name": "1440", "dtype": "float32"}, {"name": "1441", "dtype": "float32"}, {"name": "1442", "dtype": "float32"}, {"name": "1443", "dtype": "float32"}, {"name": "1444", "dtype": "float32"}, {"name": "1445", "dtype": "float32"}, {"name": "1446", "dtype": "float32"}, {"name": "1447", "dtype": "float32"}, {"name": "1448", "dtype": "float32"}, {"name": "1449", "dtype": "float32"}, {"name": "1450", "dtype": "float32"}, {"name": "1451", "dtype": "float32"}, {"name": "1452", "dtype": "float32"}, {"name": "1453", "dtype": "float32"}, {"name": "1454", "dtype": "float32"}, {"name": "1455", "dtype": "float32"}, {"name": "1456", "dtype": "float32"}, {"name": "1457", "dtype": "float32"}, {"name": "1458", "dtype": "float32"}, {"name": "1459", "dtype": "float32"}, {"name": "1460", "dtype": "float32"}, {"name": "1461", "dtype": "float32"}, {"name": "1462", "dtype": "float32"}, {"name": "1463", "dtype": "float32"}, {"name": "1464", "dtype": "float32"}, {"name": "1465", "dtype": "float32"}, {"name": "1466", "dtype": "float32"}, {"name": "1467", "dtype": "float32"}, {"name": "1468", "dtype": "float32"}, {"name": "1469", "dtype": "float32"}, {"name": "1470", "dtype": "float32"}, {"name": "1471", "dtype": "float32"}, {"name": "1472", "dtype": "float32"}, {"name": "1473", "dtype": "float32"}, {"name": "1474", "dtype": "float32"}, {"name": "1475", "dtype": "float32"}, {"name": "1476", "dtype": "float32"}, {"name": "1477", "dtype": "float32"}, {"name": "1478", "dtype": "float32"}, {"name": "1479", "dtype": "float32"}, {"name": "1480", "dtype": "float32"}, {"name": "1481", "dtype": "float32"}, {"name": "1482", "dtype": "float32"}, {"name": "1483", "dtype": "float32"}, {"name": "1484", "dtype": "float32"}, {"name": "1485", "dtype": "float32"}, {"name": "1486", "dtype": "float32"}, {"name": "1487", "dtype": "float32"}, {"name": "1488", "dtype": "float32"}, {"name": "1489", "dtype": "float32"}, {"name": "1490", "dtype": "float32"}, {"name": "1491", "dtype": "float32"}, {"name": "1492", "dtype": "float32"}, {"name": "1493", "dtype": "float32"}, {"name": "1494", "dtype": "float32"}, {"name": "1495", "dtype": "float32"}, {"name": "1496", "dtype": "float32"}, {"name": "1497", "dtype": "float32"}, {"name": "1498", "dtype": "float32"}, {"name": "1499", "dtype": "float32"}, {"name": "1500", "dtype": "float32"}, {"name": "1501", "dtype": "float32"}, {"name": "1502", "dtype": "float32"}, {"name": "1503", "dtype": "float32"}, {"name": "1504", "dtype": "float32"}, {"name": "1505", "dtype": "float32"}, {"name": "1506", "dtype": "float32"}, {"name": "1507", "dtype": "float32"}, {"name": "1508", "dtype": "float32"}, {"name": "1509", "dtype": "float32"}, {"name": "1510", "dtype": "float32"}, {"name": "1511", "dtype": "float32"}, {"name": "1512", "dtype": "float32"}, {"name": "1513", "dtype": "float32"}, {"name": "1514", "dtype": "float32"}, {"name": "1515", "dtype": "float32"}, {"name": "1516", "dtype": "float32"}, {"name": "1517", "dtype": "float32"}, {"name": "1518", "dtype": "float32"}, {"name": "1519", "dtype": "float32"}, {"name": 
"1520", "dtype": "float32"}, {"name": "1521", "dtype": "float32"}, {"name": "1522", "dtype": "float32"}, {"name": "1523", "dtype": "float32"}, {"name": "1524", "dtype": "float32"}, {"name": "1525", "dtype": "float32"}, {"name": "1526", "dtype": "float32"}, {"name": "1527", "dtype": "float32"}, {"name": "1528", "dtype": "float32"}, {"name": "1529", "dtype": "float32"}, {"name": "1530", "dtype": "float32"}, {"name": "1531", "dtype": "float32"}, {"name": "1532", "dtype": "float32"}, {"name": "1533", "dtype": "float32"}, {"name": "1534", "dtype": "float32"}, {"name": "1535", "dtype": "float32"}, {"name": "1536", "dtype": "float32"}, {"name": "1537", "dtype": "float32"}, {"name": "1538", "dtype": "float32"}, {"name": "1539", "dtype": "float32"}, {"name": "1540", "dtype": "float32"}, {"name": "1541", "dtype": "float32"}, {"name": "1542", "dtype": "float32"}, {"name": "1543", "dtype": "float32"}, {"name": "1544", "dtype": "float32"}, {"name": "1545", "dtype": "float32"}, {"name": "1546", "dtype": "float32"}, {"name": "1547", "dtype": "float32"}, {"name": "1548", "dtype": "float32"}, {"name": "1549", "dtype": "float32"}, {"name": "1550", "dtype": "float32"}, {"name": "1551", "dtype": "float32"}, {"name": "1552", "dtype": "float32"}, {"name": "1553", "dtype": "float32"}, {"name": "1554", "dtype": "float32"}, {"name": "1555", "dtype": "float32"}, {"name": "1556", "dtype": "float32"}, {"name": "1557", "dtype": "float32"}, {"name": "1558", "dtype": "float32"}, {"name": "1559", "dtype": "float32"}, {"name": "1560", "dtype": "float32"}, {"name": "1561", "dtype": "float32"}, {"name": "1562", "dtype": "float32"}, {"name": "1563", "dtype": "float32"}, {"name": "1564", "dtype": "float32"}, {"name": "1565", "dtype": "float32"}, {"name": "1566", "dtype": "float32"}, {"name": "1567", "dtype": "float32"}, {"name": "1568", "dtype": "float32"}, {"name": "1569", "dtype": "float32"}, {"name": "1570", "dtype": "float32"}, {"name": "1571", "dtype": "float32"}, {"name": "1572", "dtype": "float32"}, {"name": "1573", "dtype": "float32"}, {"name": "1574", "dtype": "float32"}, {"name": "1575", "dtype": "float32"}, {"name": "1576", "dtype": "float32"}, {"name": "1577", "dtype": "float32"}, {"name": "1578", "dtype": "float32"}, {"name": "1579", "dtype": "float32"}, {"name": "1580", "dtype": "float32"}, {"name": "1581", "dtype": "float32"}, {"name": "1582", "dtype": "float32"}, {"name": "1583", "dtype": "float32"}, {"name": "1584", "dtype": "float32"}, {"name": "1585", "dtype": "float32"}, {"name": "1586", "dtype": "float32"}, {"name": "1587", "dtype": "float32"}, {"name": "1588", "dtype": "float32"}, {"name": "1589", "dtype": "float32"}, {"name": "1590", "dtype": "float32"}, {"name": "1591", "dtype": "float32"}, {"name": "1592", "dtype": "float32"}, {"name": "1593", "dtype": "float32"}, {"name": "1594", "dtype": "float32"}, {"name": "1595", "dtype": "float32"}, {"name": "1596", "dtype": "float32"}, {"name": "1597", "dtype": "float32"}, {"name": "1598", "dtype": "float32"}, {"name": "1599", "dtype": "float32"}, {"name": "1600", "dtype": "float32"}, {"name": "1601", "dtype": "float32"}, {"name": "1602", "dtype": "float32"}, {"name": "1603", "dtype": "float32"}, {"name": "1604", "dtype": "float32"}, {"name": "1605", "dtype": "float32"}, {"name": "1606", "dtype": "float32"}, {"name": "1607", "dtype": "float32"}, {"name": "1608", "dtype": "float32"}, {"name": "1609", "dtype": "float32"}, {"name": "1610", "dtype": "float32"}, {"name": "1611", "dtype": "float32"}, {"name": "1612", "dtype": "float32"}, {"name": "1613", "dtype": 
"float32"}, {"name": "1614", "dtype": "float32"}, {"name": "1615", "dtype": "float32"}, {"name": "1616", "dtype": "float32"}, {"name": "1617", "dtype": "float32"}, {"name": "1618", "dtype": "float32"}, {"name": "1619", "dtype": "float32"}, {"name": "1620", "dtype": "float32"}, {"name": "1621", "dtype": "float32"}, {"name": "1622", "dtype": "float32"}, {"name": "1623", "dtype": "float32"}, {"name": "1624", "dtype": "float32"}, {"name": "1625", "dtype": "float32"}, {"name": "1626", "dtype": "float32"}, {"name": "1627", "dtype": "float32"}, {"name": "1628", "dtype": "float32"}, {"name": "1629", "dtype": "float32"}, {"name": "1630", "dtype": "float32"}, {"name": "1631", "dtype": "float32"}, {"name": "1632", "dtype": "float32"}, {"name": "1633", "dtype": "float32"}, {"name": "1634", "dtype": "float32"}, {"name": "1635", "dtype": "float32"}, {"name": "1636", "dtype": "float32"}, {"name": "1637", "dtype": "float32"}, {"name": "1638", "dtype": "float32"}, {"name": "1639", "dtype": "float32"}, {"name": "1640", "dtype": "float32"}, {"name": "1641", "dtype": "float32"}, {"name": "1642", "dtype": "float32"}, {"name": "1643", "dtype": "float32"}, {"name": "1644", "dtype": "float32"}, {"name": "1645", "dtype": "float32"}, {"name": "1646", "dtype": "float32"}, {"name": "1647", "dtype": "float32"}, {"name": "1648", "dtype": "float32"}, {"name": "1649", "dtype": "float32"}, {"name": "1650", "dtype": "float32"}, {"name": "1651", "dtype": "float32"}, {"name": "1652", "dtype": "float32"}, {"name": "1653", "dtype": "float32"}, {"name": "1654", "dtype": "float32"}, {"name": "1655", "dtype": "float32"}, {"name": "1656", "dtype": "float32"}, {"name": "1657", "dtype": "float32"}, {"name": "1658", "dtype": "float32"}, {"name": "1659", "dtype": "float32"}, {"name": "1660", "dtype": "float32"}, {"name": "1661", "dtype": "float32"}, {"name": "1662", "dtype": "float32"}, {"name": "1663", "dtype": "float32"}, {"name": "1664", "dtype": "float32"}, {"name": "1665", "dtype": "float32"}, {"name": "1666", "dtype": "float32"}, {"name": "1667", "dtype": "float32"}, {"name": "1668", "dtype": "float32"}, {"name": "1669", "dtype": "float32"}, {"name": "1670", "dtype": "float32"}, {"name": "1671", "dtype": "float32"}, {"name": "1672", "dtype": "float32"}, {"name": "1673", "dtype": "float32"}, {"name": "1674", "dtype": "float32"}, {"name": "1675", "dtype": "float32"}, {"name": "1676", "dtype": "float32"}, {"name": "1677", "dtype": "float32"}, {"name": "1678", "dtype": "float32"}, {"name": "1679", "dtype": "float32"}, {"name": "1680", "dtype": "float32"}, {"name": "1681", "dtype": "float32"}, {"name": "1682", "dtype": "float32"}, {"name": "1683", "dtype": "float32"}, {"name": "1684", "dtype": "float32"}, {"name": "1685", "dtype": "float32"}, {"name": "1686", "dtype": "float32"}, {"name": "1687", "dtype": "float32"}, {"name": "1688", "dtype": "float32"}, {"name": "1689", "dtype": "float32"}, {"name": "1690", "dtype": "float32"}, {"name": "1691", "dtype": "float32"}, {"name": "1692", "dtype": "float32"}, {"name": "1693", "dtype": "float32"}, {"name": "1694", "dtype": "float32"}, {"name": "1695", "dtype": "float32"}, {"name": "1696", "dtype": "float32"}, {"name": "1697", "dtype": "float32"}, {"name": "1698", "dtype": "float32"}, {"name": "1699", "dtype": "float32"}, {"name": "1700", "dtype": "float32"}, {"name": "1701", "dtype": "float32"}, {"name": "1702", "dtype": "float32"}, {"name": "1703", "dtype": "float32"}, {"name": "1704", "dtype": "float32"}, {"name": "1705", "dtype": "float32"}, {"name": "1706", "dtype": "float32"}, {"name": 
"1707", "dtype": "float32"}, {"name": "1708", "dtype": "float32"}, {"name": "1709", "dtype": "float32"}, {"name": "1710", "dtype": "float32"}, {"name": "1711", "dtype": "float32"}, {"name": "1712", "dtype": "float32"}, {"name": "1713", "dtype": "float32"}, {"name": "1714", "dtype": "float32"}, {"name": "1715", "dtype": "float32"}, {"name": "1716", "dtype": "float32"}, {"name": "1717", "dtype": "float32"}, {"name": "1718", "dtype": "float32"}, {"name": "1719", "dtype": "float32"}, {"name": "1720", "dtype": "float32"}, {"name": "1721", "dtype": "float32"}, {"name": "1722", "dtype": "float32"}, {"name": "1723", "dtype": "float32"}, {"name": "1724", "dtype": "float32"}, {"name": "1725", "dtype": "float32"}, {"name": "1726", "dtype": "float32"}, {"name": "1727", "dtype": "float32"}, {"name": "1728", "dtype": "float32"}, {"name": "1729", "dtype": "float32"}, {"name": "1730", "dtype": "float32"}, {"name": "1731", "dtype": "float32"}, {"name": "1732", "dtype": "float32"}, {"name": "1733", "dtype": "float32"}, {"name": "1734", "dtype": "float32"}, {"name": "1735", "dtype": "float32"}, {"name": "1736", "dtype": "float32"}, {"name": "1737", "dtype": "float32"}, {"name": "1738", "dtype": "float32"}, {"name": "1739", "dtype": "float32"}, {"name": "1740", "dtype": "float32"}, {"name": "1741", "dtype": "float32"}, {"name": "1742", "dtype": "float32"}, {"name": "1743", "dtype": "float32"}, {"name": "1744", "dtype": "float32"}, {"name": "1745", "dtype": "float32"}, {"name": "1746", "dtype": "float32"}, {"name": "1747", "dtype": "float32"}, {"name": "1748", "dtype": "float32"}, {"name": "1749", "dtype": "float32"}, {"name": "1750", "dtype": "float32"}, {"name": "1751", "dtype": "float32"}, {"name": "1752", "dtype": "float32"}, {"name": "1753", "dtype": "float32"}, {"name": "1754", "dtype": "float32"}, {"name": "1755", "dtype": "float32"}, {"name": "1756", "dtype": "float32"}, {"name": "1757", "dtype": "float32"}, {"name": "1758", "dtype": "float32"}, {"name": "1759", "dtype": "float32"}, {"name": "1760", "dtype": "float32"}, {"name": "1761", "dtype": "float32"}, {"name": "1762", "dtype": "float32"}, {"name": "1763", "dtype": "float32"}, {"name": "1764", "dtype": "float32"}, {"name": "1765", "dtype": "float32"}, {"name": "1766", "dtype": "float32"}, {"name": "1767", "dtype": "float32"}, {"name": "1768", "dtype": "float32"}, {"name": "1769", "dtype": "float32"}, {"name": "1770", "dtype": "float32"}, {"name": "1771", "dtype": "float32"}, {"name": "1772", "dtype": "float32"}, {"name": "1773", "dtype": "float32"}, {"name": "1774", "dtype": "float32"}, {"name": "1775", "dtype": "float32"}, {"name": "1776", "dtype": "float32"}, {"name": "1777", "dtype": "float32"}, {"name": "1778", "dtype": "float32"}, {"name": "1779", "dtype": "float32"}, {"name": "1780", "dtype": "float32"}, {"name": "1781", "dtype": "float32"}, {"name": "1782", "dtype": "float32"}, {"name": "1783", "dtype": "float32"}, {"name": "1784", "dtype": "float32"}, {"name": "1785", "dtype": "float32"}, {"name": "1786", "dtype": "float32"}, {"name": "1787", "dtype": "float32"}, {"name": "1788", "dtype": "float32"}, {"name": "1789", "dtype": "float32"}, {"name": "1790", "dtype": "float32"}, {"name": "1791", "dtype": "float32"}, {"name": "1792", "dtype": "float32"}, {"name": "1793", "dtype": "float32"}, {"name": "1794", "dtype": "float32"}, {"name": "1795", "dtype": "float32"}, {"name": "1796", "dtype": "float32"}, {"name": "1797", "dtype": "float32"}, {"name": "1798", "dtype": "float32"}, {"name": "1799", "dtype": "float32"}, {"name": "1800", "dtype": 
"float32"}, {"name": "1801", "dtype": "float32"}, {"name": "1802", "dtype": "float32"}, {"name": "1803", "dtype": "float32"}, {"name": "1804", "dtype": "float32"}, {"name": "1805", "dtype": "float32"}, {"name": "1806", "dtype": "float32"}, {"name": "1807", "dtype": "float32"}, {"name": "1808", "dtype": "float32"}, {"name": "1809", "dtype": "float32"}, {"name": "1810", "dtype": "float32"}, {"name": "1811", "dtype": "float32"}, {"name": "1812", "dtype": "float32"}, {"name": "1813", "dtype": "float32"}, {"name": "1814", "dtype": "float32"}, {"name": "1815", "dtype": "float32"}, {"name": "1816", "dtype": "float32"}, {"name": "1817", "dtype": "float32"}, {"name": "1818", "dtype": "float32"}, {"name": "1819", "dtype": "float32"}, {"name": "1820", "dtype": "float32"}, {"name": "1821", "dtype": "float32"}, {"name": "1822", "dtype": "float32"}, {"name": "1823", "dtype": "float32"}, {"name": "1824", "dtype": "float32"}, {"name": "1825", "dtype": "float32"}, {"name": "1826", "dtype": "float32"}, {"name": "1827", "dtype": "float32"}, {"name": "1828", "dtype": "float32"}, {"name": "1829", "dtype": "float32"}, {"name": "1830", "dtype": "float32"}, {"name": "1831", "dtype": "float32"}, {"name": "1832", "dtype": "float32"}, {"name": "1833", "dtype": "float32"}, {"name": "1834", "dtype": "float32"}, {"name": "1835", "dtype": "float32"}, {"name": "1836", "dtype": "float32"}, {"name": "1837", "dtype": "float32"}, {"name": "1838", "dtype": "float32"}, {"name": "1839", "dtype": "float32"}, {"name": "1840", "dtype": "float32"}, {"name": "1841", "dtype": "float32"}, {"name": "1842", "dtype": "float32"}, {"name": "1843", "dtype": "float32"}, {"name": "1844", "dtype": "float32"}, {"name": "1845", "dtype": "float32"}, {"name": "1846", "dtype": "float32"}, {"name": "1847", "dtype": "float32"}, {"name": "1848", "dtype": "float32"}, {"name": "1849", "dtype": "float32"}, {"name": "1850", "dtype": "float32"}, {"name": "1851", "dtype": "float32"}, {"name": "1852", "dtype": "float32"}, {"name": "1853", "dtype": "float32"}, {"name": "1854", "dtype": "float32"}, {"name": "1855", "dtype": "float32"}, {"name": "1856", "dtype": "float32"}, {"name": "1857", "dtype": "float32"}, {"name": "1858", "dtype": "float32"}, {"name": "1859", "dtype": "float32"}, {"name": "1860", "dtype": "float32"}, {"name": "1861", "dtype": "float32"}, {"name": "1862", "dtype": "float32"}, {"name": "1863", "dtype": "float32"}, {"name": "1864", "dtype": "float32"}, {"name": "1865", "dtype": "float32"}, {"name": "1866", "dtype": "float32"}, {"name": "1867", "dtype": "float32"}, {"name": "1868", "dtype": "float32"}, {"name": "1869", "dtype": "float32"}, {"name": "1870", "dtype": "float32"}, {"name": "1871", "dtype": "float32"}, {"name": "1872", "dtype": "float32"}, {"name": "1873", "dtype": "float32"}, {"name": "1874", "dtype": "float32"}, {"name": "1875", "dtype": "float32"}, {"name": "1876", "dtype": "float32"}, {"name": "1877", "dtype": "float32"}, {"name": "1878", "dtype": "float32"}, {"name": "1879", "dtype": "float32"}, {"name": "1880", "dtype": "float32"}, {"name": "1881", "dtype": "float32"}, {"name": "1882", "dtype": "float32"}, {"name": "1883", "dtype": "float32"}, {"name": "1884", "dtype": "float32"}, {"name": "1885", "dtype": "float32"}, {"name": "1886", "dtype": "float32"}, {"name": "1887", "dtype": "float32"}, {"name": "1888", "dtype": "float32"}, {"name": "1889", "dtype": "float32"}, {"name": "1890", "dtype": "float32"}, {"name": "1891", "dtype": "float32"}, {"name": "1892", "dtype": "float32"}, {"name": "1893", "dtype": "float32"}, {"name": 
"1894", "dtype": "float32"}, {"name": "1895", "dtype": "float32"}, {"name": "1896", "dtype": "float32"}, {"name": "1897", "dtype": "float32"}, {"name": "1898", "dtype": "float32"}, {"name": "1899", "dtype": "float32"}, {"name": "1900", "dtype": "float32"}, {"name": "1901", "dtype": "float32"}, {"name": "1902", "dtype": "float32"}, {"name": "1903", "dtype": "float32"}, {"name": "1904", "dtype": "float32"}, {"name": "1905", "dtype": "float32"}, {"name": "1906", "dtype": "float32"}, {"name": "1907", "dtype": "float32"}, {"name": "1908", "dtype": "float32"}, {"name": "1909", "dtype": "float32"}, {"name": "1910", "dtype": "float32"}, {"name": "1911", "dtype": "float32"}, {"name": "1912", "dtype": "float32"}, {"name": "1913", "dtype": "float32"}, {"name": "1914", "dtype": "float32"}, {"name": "1915", "dtype": "float32"}, {"name": "1916", "dtype": "float32"}, {"name": "1917", "dtype": "float32"}, {"name": "1918", "dtype": "float32"}, {"name": "1919", "dtype": "float32"}, {"name": "1920", "dtype": "float32"}, {"name": "1921", "dtype": "float32"}, {"name": "1922", "dtype": "float32"}, {"name": "1923", "dtype": "float32"}, {"name": "1924", "dtype": "float32"}, {"name": "1925", "dtype": "float32"}, {"name": "1926", "dtype": "float32"}, {"name": "1927", "dtype": "float32"}, {"name": "1928", "dtype": "float32"}, {"name": "1929", "dtype": "float32"}, {"name": "1930", "dtype": "float32"}, {"name": "1931", "dtype": "float32"}, {"name": "1932", "dtype": "float32"}, {"name": "1933", "dtype": "float32"}, {"name": "1934", "dtype": "float32"}, {"name": "1935", "dtype": "float32"}, {"name": "1936", "dtype": "float32"}, {"name": "1937", "dtype": "float32"}, {"name": "1938", "dtype": "float32"}, {"name": "1939", "dtype": "float32"}, {"name": "1940", "dtype": "float32"}, {"name": "1941", "dtype": "float32"}, {"name": "1942", "dtype": "float32"}, {"name": "1943", "dtype": "float32"}, {"name": "1944", "dtype": "float32"}, {"name": "1945", "dtype": "float32"}, {"name": "1946", "dtype": "float32"}, {"name": "1947", "dtype": "float32"}, {"name": "1948", "dtype": "float32"}, {"name": "1949", "dtype": "float32"}, {"name": "1950", "dtype": "float32"}, {"name": "1951", "dtype": "float32"}, {"name": "1952", "dtype": "float32"}, {"name": "1953", "dtype": "float32"}, {"name": "1954", "dtype": "float32"}, {"name": "1955", "dtype": "float32"}, {"name": "1956", "dtype": "float32"}, {"name": "1957", "dtype": "float32"}, {"name": "1958", "dtype": "float32"}, {"name": "1959", "dtype": "float32"}, {"name": "1960", "dtype": "float32"}, {"name": "1961", "dtype": "float32"}, {"name": "1962", "dtype": "float32"}, {"name": "1963", "dtype": "float32"}, {"name": "1964", "dtype": "float32"}, {"name": "1965", "dtype": "float32"}, {"name": "1966", "dtype": "float32"}, {"name": "1967", "dtype": "float32"}, {"name": "1968", "dtype": "float32"}, {"name": "1969", "dtype": "float32"}, {"name": "1970", "dtype": "float32"}, {"name": "1971", "dtype": "float32"}, {"name": "1972", "dtype": "float32"}, {"name": "1973", "dtype": "float32"}, {"name": "1974", "dtype": "float32"}, {"name": "1975", "dtype": "float32"}, {"name": "1976", "dtype": "float32"}, {"name": "1977", "dtype": "float32"}, {"name": "1978", "dtype": "float32"}, {"name": "1979", "dtype": "float32"}, {"name": "1980", "dtype": "float32"}, {"name": "1981", "dtype": "float32"}, {"name": "1982", "dtype": "float32"}, {"name": "1983", "dtype": "float32"}, {"name": "1984", "dtype": "float32"}, {"name": "1985", "dtype": "float32"}, {"name": "1986", "dtype": "float32"}, {"name": "1987", "dtype": 
"float32"}, {"name": "1988", "dtype": "float32"}, {"name": "1989", "dtype": "float32"}, {"name": "1990", "dtype": "float32"}, {"name": "1991", "dtype": "float32"}, {"name": "1992", "dtype": "float32"}, {"name": "1993", "dtype": "float32"}, {"name": "1994", "dtype": "float32"}, {"name": "1995", "dtype": "float32"}, {"name": "1996", "dtype": "float32"}, {"name": "1997", "dtype": "float32"}, {"name": "1998", "dtype": "float32"}, {"name": "1999", "dtype": "float32"}, {"name": "2000", "dtype": "float32"}, {"name": "2001", "dtype": "float32"}, {"name": "2002", "dtype": "float32"}, {"name": "2003", "dtype": "float32"}, {"name": "2004", "dtype": "float32"}, {"name": "2005", "dtype": "float32"}, {"name": "2006", "dtype": "float32"}, {"name": "2007", "dtype": "float32"}, {"name": "2008", "dtype": "float32"}, {"name": "2009", "dtype": "float32"}, {"name": "2010", "dtype": "float32"}, {"name": "2011", "dtype": "float32"}, {"name": "2012", "dtype": "float32"}, {"name": "2013", "dtype": "float32"}, {"name": "2014", "dtype": "float32"}, {"name": "2015", "dtype": "float32"}, {"name": "2016", "dtype": "float32"}, {"name": "2017", "dtype": "float32"}, {"name": "2018", "dtype": "float32"}, {"name": "2019", "dtype": "float32"}, {"name": "2020", "dtype": "float32"}, {"name": "2021", "dtype": "float32"}, {"name": "2022", "dtype": "float32"}, {"name": "2023", "dtype": "float32"}, {"name": "2024", "dtype": "float32"}, {"name": "2025", "dtype": "float32"}, {"name": "2026", "dtype": "float32"}, {"name": "2027", "dtype": "float32"}, {"name": "2028", "dtype": "float32"}, {"name": "2029", "dtype": "float32"}, {"name": "2030", "dtype": "float32"}, {"name": "2031", "dtype": "float32"}, {"name": "2032", "dtype": "float32"}, {"name": "2033", "dtype": "float32"}, {"name": "2034", "dtype": "float32"}, {"name": "2035", "dtype": "float32"}, {"name": "2036", "dtype": "float32"}, {"name": "2037", "dtype": "float32"}, {"name": "2038", "dtype": "float32"}, {"name": "2039", "dtype": "float32"}, {"name": "2040", "dtype": "float32"}, {"name": "2041", "dtype": "float32"}, {"name": "2042", "dtype": "float32"}, {"name": "2043", "dtype": "float32"}, {"name": "2044", "dtype": "float32"}, {"name": "2045", "dtype": "float32"}, {"name": "2046", "dtype": "float32"}, {"name": "2047", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 307576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 102525577.5, "num_examples": 12500}], "download_size": 565392402, "dataset_size": 410102307.1875}}
2023-08-18T19:03:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_GPTNEO_Baseline" More Information needed
[ "# Dataset Card for \"Thunderbird_GPTNEO_Baseline\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_GPTNEO_Baseline\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_GPTNEO_Baseline\"\n\nMore Information needed" ]
0b6f109b8ca5fa5d397e02d34c0317d5852fcf7c
# Dataset Card for "aya_persian_instruction_pn-summary" # Summary aya_persian_instruction_pn-summary is an open source dataset of instruct-style records generated from [pn-summary](https://huggingface.co/datasets/pn_summary) dataset. pn-summary is a Persian summarization dataset and here we transformed it to prompt-completion style to be used in the [Aya project](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI. # Templates For the creation of instruct-style prompts and completions from the original dataset, the following templates were used: - Given a text, generate a summary for it. | template_id | inputs | targets | |-------------|--------|---------| | 1 | ```متن زیر را خلاصه کنید:\n{{Original Text}}``` | ```{{Original Summary}}``` | | 2 | ```برای متن زیر یک خلاصه بنویسید:\n{Original Text}}``` | ```{{Original Summary}}``` | | 3 | ```یک یا چند جمله به عنوان خلاصه متن زیر بنویسید:\n{Original Text}}``` | ```{{Original Summary}}``` | # Language Persian # Licensing Information This dataset is licensed under MIT License.
Shafagh/aya_persian_instruction_pn-summary
[ "license:mit", "region:us" ]
2023-08-18T19:00:15+00:00
{"license": "mit", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "template_lang", "sequence": "string"}, {"name": "template_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 277006335, "num_examples": 82022}, {"name": "validation", "num_bytes": 19104829, "num_examples": 5592}, {"name": "test", "num_bytes": 18729011, "num_examples": 5593}], "download_size": 142457276, "dataset_size": 314840175}}
2024-01-25T15:38:43+00:00
[]
[]
TAGS #license-mit #region-us
Dataset Card for "aya\_persian\_instruction\_pn-summary" ======================================================== Summary ======= aya\_persian\_instruction\_pn-summary is an open source dataset of instruct-style records generated from pn-summary dataset. pn-summary is a Persian summarization dataset and here we transformed it to prompt-completion style to be used in the Aya project from Cohere For AI. Templates ========= For the creation of instruct-style prompts and completions from the original dataset, the following templates were used: * Given a text, generate a summary for it. template\_id: 1, inputs: , targets: template\_id: 2, inputs: , targets: template\_id: 3, inputs: , targets: Language ======== Persian Licensing Information ===================== This dataset is licensed under MIT License.
[]
[ "TAGS\n#license-mit #region-us \n" ]
[ 11 ]
[ "passage: TAGS\n#license-mit #region-us \n" ]
0eba81ac6a67a52dccd4566b21a3451198bb755a
# Dataset Card for "Thunderbird_BERT_Finetuned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/Thunderbird_BERT_Finetuned
[ "region:us" ]
2023-08-18T19:05:19+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 38525577.5, "num_examples": 12500}], "download_size": 211881253, "dataset_size": 154102307.1875}}
2023-08-23T02:56:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_BERT_Finetuned" More Information needed
[ "# Dataset Card for \"Thunderbird_BERT_Finetuned\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_BERT_Finetuned\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_BERT_Finetuned\"\n\nMore Information needed" ]
ebcb01218ce087047e201dca7348623b84f40235
# Dataset Card for "en-hi" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
rajuptvs/English-to-hindi-podcast-translation
[ "region:us" ]
2023-08-18T19:07:42+00:00
{"dataset_info": {"features": [{"name": "video_id", "dtype": "string"}, {"name": "English subtitles", "dtype": "string"}, {"name": "Hindi subtitles", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1827416, "num_examples": 11427}], "download_size": 784942, "dataset_size": 1827416}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T19:07:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for "en-hi" More Information needed
[ "# Dataset Card for \"en-hi\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"en-hi\"\n\nMore Information needed" ]
[ 6, 13 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"en-hi\"\n\nMore Information needed" ]
f5cca4e708d633040df9d65aaf93bf552498ef80
# Dataset Card for "Thunderbird_RoBERTa_Finetuned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/Thunderbird_RoBERTa_Finetuned
[ "region:us" ]
2023-08-18T19:12:30+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 38525577.5, "num_examples": 12500}], "download_size": 211881241, "dataset_size": 154102307.1875}}
2023-08-23T03:05:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_RoBERTa_Finetuned" More Information needed
[ "# Dataset Card for \"Thunderbird_RoBERTa_Finetuned\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_RoBERTa_Finetuned\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_RoBERTa_Finetuned\"\n\nMore Information needed" ]
a1a3888d509e4b2d8f99bd70f1d32a1bf2ab23e9
# Dataset Card for "Thunderbird_DistilRoBERTa_Finetuned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/Thunderbird_DistilRoBERTa_Finetuned
[ "region:us" ]
2023-08-18T19:18:59+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 38525577.5, "num_examples": 12500}], "download_size": 211881255, "dataset_size": 154102307.1875}}
2023-08-23T03:12:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_DistilRoBERTa_Finetuned" More Information needed
[ "# Dataset Card for \"Thunderbird_DistilRoBERTa_Finetuned\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_DistilRoBERTa_Finetuned\"\n\nMore Information needed" ]
[ 6, 24 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_DistilRoBERTa_Finetuned\"\n\nMore Information needed" ]
cc60b8251063e8d15aa48cfa05c726302bc6e477
# Dataset of tsukumo_yatsuhashi/九十九八橋 (Touhou) This is the dataset of tsukumo_yatsuhashi/九十九八橋 (Touhou), containing 253 images and their tags. The core tags of this character are `brown_hair, short_hair, hairband, brown_eyes, purple_hairband`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 253 | 190.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_yatsuhashi_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 253 | 149.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_yatsuhashi_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 452 | 250.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_yatsuhashi_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 253 | 183.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_yatsuhashi_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 452 | 296.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_yatsuhashi_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/tsukumo_yatsuhashi_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; some outfits may be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, barefoot, black_skirt, smile, solo, white_shirt, collared_shirt, long_sleeves, simple_background, closed_mouth, eighth_note, frilled_skirt, full_body, white_background, looking_at_viewer, staff_(music), instrument | | 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, long_sleeves, shirt, skirt, smile, solo, looking_at_viewer, eighth_note, open_mouth | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, beamed_eighth_notes, black_skirt, collared_shirt, long_sleeves, looking_at_viewer, solo, white_shirt, bangs, beamed_sixteenth_notes, frilled_skirt, open_mouth, quarter_note, :d, hair_between_eyes, simple_background, staff_(music), blush, cowboy_shot, instrument | | 3 | 17 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 2girls, shirt, smile, long_sleeves, open_mouth, sisters, barefoot, eighth_note, purple_hair, biwa_lute, hair_flower, looking_at_viewer, black_skirt | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | barefoot | black_skirt | smile | solo | white_shirt | collared_shirt | long_sleeves | simple_background | closed_mouth | eighth_note | frilled_skirt | full_body | white_background | looking_at_viewer | staff_(music) | instrument | shirt | skirt | open_mouth | beamed_eighth_notes | bangs | beamed_sixteenth_notes | quarter_note | :d | hair_between_eyes | blush | cowboy_shot | 2girls | sisters | purple_hair | biwa_lute | hair_flower | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------|:--------|:-------|:--------------|:-----------------|:---------------|:--------------------|:---------------|:--------------|:----------------|:------------|:-------------------|:--------------------|:----------------|:-------------|:--------|:--------|:-------------|:----------------------|:--------|:-------------------------|:---------------|:-----|:--------------------|:--------|:--------------|:---------|:----------|:--------------|:------------|:--------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | 1 | 15 | ![](samples/1/clu1-sample0.png) | 
![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | X | | | X | | | X | | | | X | | | X | X | X | | | | | | | | | | | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | X | X | X | X | X | | | X | | | X | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | 3 | 17 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | | X | X | X | | | | X | | | X | | | | X | | | X | | X | | | | | | | | | X | X | X | X | X |
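The card's loading snippet iterates items exposing `item.image`, `item.meta['filename']`, and `item.meta['tags']`. Building on that, here is a small sketch that keeps only images carrying a particular tag; it assumes `item.meta['tags']` is a collection keyed by tag name (a list of tags or a tag-to-score dict), and `smile` is just an illustrative tag taken from the cluster tables above.

```python
# Sketch: filter the extracted raw package down to images tagged 'smile',
# reusing the LocalSource loading shown in the card above.
# ASSUMES item.meta['tags'] supports membership tests by tag name.
import os

from waifuc.source import LocalSource

source = LocalSource('dataset_dir')   # directory extracted from dataset-raw.zip
out_dir = 'tsukumo_smile'
os.makedirs(out_dir, exist_ok=True)

for item in source:
    if 'smile' in item.meta['tags']:  # works for a tag list or a dict keyed by tag
        item.image.save(os.path.join(out_dir, item.meta['filename']))
```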
CyberHarem/tsukumo_yatsuhashi_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T19:25:06+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T02:45:28+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of tsukumo\_yatsuhashi/九十九八橋 (Touhou) ============================================= This is the dataset of tsukumo\_yatsuhashi/九十九八橋 (Touhou), containing 253 images and their tags. The core tags of this character are 'brown\_hair, short\_hair, hairband, brown\_eyes, purple\_hairband', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need it, run the following code List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
308926b04fda232a8a02228470cd12f660f38618
# Dataset of aki_shizuha/秋静葉 (Touhou) This is the dataset of aki_shizuha/秋静葉 (Touhou), containing 500 images and their tags. The core tags of this character are `blonde_hair, short_hair, hair_ornament, leaf_hair_ornament, yellow_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 474.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aki_shizuha_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 344.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aki_shizuha_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1041 | 636.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aki_shizuha_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 446.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aki_shizuha_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1041 | 787.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aki_shizuha_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/aki_shizuha_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; some outfits may be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, red_skirt, solo, long_sleeves, looking_at_viewer, bangs, buttons, open_mouth, blush, cowboy_shot, autumn_leaves, maple_leaf, orange_eyes, :d, hair_between_eyes, skirt_hold, collared_shirt | | 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, long_sleeves, solo, buttons, red_skirt, collared_shirt, looking_at_viewer, bangs, closed_mouth, shoes, white_socks, maple_leaf, autumn_leaves, full_body, simple_background, smile, white_background, black_footwear, skirt_hold, blush, leaf_on_head | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, maple_leaf, solo, dress, leaf_on_head, blush | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 2girls, leaf, long_sleeves, smile, bangs, closed_mouth, red_shirt, red_skirt, sisters, solo_focus, blush, hair_between_eyes, looking_at_viewer, red_dress, autumn_leaves, buttons | | 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, black_ribbon, black_skirt, grapes, long_sleeves, mob_cap, neck_ribbon, solo, looking_at_viewer, red_apron, red_headwear, wide_sleeves, yellow_shirt, autumn_leaves, bangs, food-themed_hair_ornament, frills, full_body, hat_ornament, open_mouth, red_eyes, maple_leaf, puffy_sleeves, :d, barefoot, choker, simple_background, white_background | | 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 2girls, grapes, long_sleeves, mob_cap, open_mouth, red_apron, red_headwear, dress, sisters, wide_sleeves, :d, black_skirt, leaf, yellow_shirt, red_eyes, solo_focus, barefoot, blush, holding_hands, looking_at_viewer, neck_ribbon, puffy_sleeves | | 6 | 16 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, grapes, hat, solo, red_eyes, smile, open_mouth, leaf, dress | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 2girls, sisters, open_mouth, leaf_on_head, grapes, 
hat | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, cowboy_shot, frilled_shirt_collar, looking_at_viewer, marker_(medium), open_mouth, sample_watermark, solo, :d, autumn_leaves, hair_ribbon, leaf_print, medium_hair, long_sleeves, maple_leaf, puffy_sleeves, red_eyes, red_ribbon, red_skirt, bangs, fang, skirt_hold, black_ribbon, bowtie, buttons, embellished_costume, frilled_skirt, frilled_sleeves, hair_between_eyes, orange_background, orange_theme, print_skirt, red_dress | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1boy, 1girl, blush, hetero, nipples, solo_focus, open_mouth, penis, pussy, navel, sex, vaginal, censored, leaf, medium_breasts, tears, nude, small_breasts, lying, one_eye_closed, pubic_hair, spread_legs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | red_skirt | solo | long_sleeves | looking_at_viewer | bangs | buttons | open_mouth | blush | cowboy_shot | autumn_leaves | maple_leaf | orange_eyes | :d | hair_between_eyes | skirt_hold | collared_shirt | closed_mouth | shoes | white_socks | full_body | simple_background | smile | white_background | black_footwear | leaf_on_head | dress | 2girls | leaf | red_shirt | sisters | solo_focus | red_dress | black_ribbon | black_skirt | grapes | mob_cap | neck_ribbon | red_apron | red_headwear | wide_sleeves | yellow_shirt | food-themed_hair_ornament | frills | hat_ornament | red_eyes | puffy_sleeves | barefoot | choker | holding_hands | hat | frilled_shirt_collar | marker_(medium) | sample_watermark | hair_ribbon | leaf_print | medium_hair | red_ribbon | fang | bowtie | embellished_costume | frilled_skirt | frilled_sleeves | orange_background | orange_theme | print_skirt | 1boy | hetero | nipples | penis | pussy | navel | sex | vaginal | censored | medium_breasts | tears | nude | small_breasts | lying | one_eye_closed | pubic_hair | spread_legs | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-------|:---------------|:--------------------|:--------|:----------|:-------------|:--------|:--------------|:----------------|:-------------|:--------------|:-----|:--------------------|:-------------|:-----------------|:---------------|:--------|:--------------|:------------|:--------------------|:--------|:-------------------|:-----------------|:---------------|:--------|:---------|:-------|:------------|:----------|:-------------|:------------|:---------------|:--------------|:---------|:----------|:--------------|:------------|:---------------|:---------------|:---------------|:----------------------------|:---------|:---------------|:-----------|:----------------|:-----------|:---------|:----------------|:------|:-----------------------|:------------------|:-------------------|:--------------|:-------------|:--------------|:-------------|:-------|:---------|:----------------------|:----------------|:------------------|:--------------------|:---------------|:--------------|:-------|:---------|:----------|:--------|:--------|:--------|:------|:----------|:-----------|:-----------------|:--------|:-------|:----------------|:--------|:-----------------|:-------------|:--------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | | X | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | | | | | X | | | X | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | | X | | X | X | X | X | | X | | X | | | | X | | | X | | | | | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | X | X | X | | X | | | X | X | | X | | | | | | | X | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | | | | X | X | | | X | X | | | | | X | | | | | 
| | | | | | | | X | X | X | | X | X | | | X | X | X | X | X | X | X | X | | | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 16 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | | | | X | | | | | | | | | | | | | | | X | | | | X | | X | | | | | | | X | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | X | | | X | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
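Besides the raw archive, the package table in this card lists pre-processed variants; fetching one works with the same `hf_hub_download` call the card already uses, only with a different `filename`. A minimal sketch for the 800px IMG+TXT package (repo id and filename taken verbatim from the table above):

```python
# Sketch: download and extract the 800px IMG+TXT package instead of the raw
# archive, using the exact repo id and filename listed in the table above.
import os
import zipfile

from huggingface_hub import hf_hub_download

zip_file = hf_hub_download(
    repo_id='CyberHarem/aki_shizuha_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'aki_shizuha_800'  # any local directory name
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)   # images paired with .txt tag files
```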
CyberHarem/aki_shizuha_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T19:25:37+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-14T19:07:02+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of aki\_shizuha/秋静葉 (Touhou) ==================================== This is the dataset of aki\_shizuha/秋静葉 (Touhou), containing 500 images and their tags. The core tags of this character are 'blonde\_hair, short\_hair, hair\_ornament, leaf\_hair\_ornament, yellow\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need it, run the following code List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
296d0c2560b5df4af27459dc60393d53493dd4d5
# Dataset Card for "Thunderbird_GPT2_Finetuned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/Thunderbird_GPT2_Finetuned
[ "region:us" ]
2023-08-18T19:26:17+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 38525577.5, "num_examples": 12500}], "download_size": 211858414, "dataset_size": 154102307.1875}}
2023-08-23T03:20:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_GPT2_Finetuned" More Information needed
[ "# Dataset Card for \"Thunderbird_GPT2_Finetuned\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_GPT2_Finetuned\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_GPT2_Finetuned\"\n\nMore Information needed" ]
8f600f12b14c19a282d9642163a71806bfc06bc6
# Dataset Card for "aya_persian_instruction_pn-summary-title" # Summary aya_persian_instruction_pn-summary-title is an open source dataset of instruct-style records generated from [pn-summary](https://huggingface.co/datasets/pn_summary) dataset. pn-summary is a Persian summarization dataset which contains a set of documents along with their summaries and titles. Here we transformed it to prompt-completion style to be used in the [Aya project](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI. # Templates For the creation of instruct-style prompts and completions from the original dataset, the following templates were used: - Given a text, suggest a title for it. | template_id | inputs | targets | |-------------|--------|---------| | 1 | ```برای متن زیر یک عنوان مناسب پیشنهاد دهید:\n{{Original Text}}``` | ```این عنوان می‌تواند برای متن مورد نظر مناسب باشد:{{Original Title}}``` | | 2 | ```از نظر شما یک تیتر مناسب برای مقاله زیر چه می‌تواند باشد؟:\n{Original Text}}``` | ```این عنوان می‌تواند برای متن مورد نظر مناسب باشد:{{Original Title}}``` | # Language Persian # Licensing Information This dataset is licensed under MIT License.
Shafagh/aya_persian_instruction_pn-summary-title
[ "license:mit", "region:us" ]
2023-08-18T19:30:44+00:00
{"license": "mit", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "template_lang", "sequence": "string"}, {"name": "template_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 271655885, "num_examples": 82022}, {"name": "validation", "num_bytes": 18753925, "num_examples": 5592}, {"name": "test", "num_bytes": 18353375, "num_examples": 5593}], "download_size": 135315191, "dataset_size": 308763185}}
2024-01-25T15:37:28+00:00
[]
[]
TAGS #license-mit #region-us
Dataset Card for "aya\_persian\_instruction\_pn-summary-title" ============================================================== Summary ======= aya\_persian\_instruction\_pn-summary-title is an open source dataset of instruct-style records generated from pn-summary dataset. pn-summary is a Persian summarization dataset which contains a set of documents along with their summaries and titles. Here we transformed it to prompt-completion style to be used in the Aya project from Cohere For AI. Templates ========= For the creation of instruct-style prompts and completions from the original dataset, the following templates were used: * Given a text, suggest a title for it. template\_id: 1, inputs: , targets: template\_id: 2, inputs: , targets: Language ======== Persian Licensing Information ===================== This dataset is licensed under MIT License.
[]
[ "TAGS\n#license-mit #region-us \n" ]
[ 11 ]
[ "passage: TAGS\n#license-mit #region-us \n" ]
7ee4456a9716a9ae4a0ae2b53ec68ba036c2b229
Inputs and targets in this dataset are pre-normalized and scaled with the .nc normalization files found in the GitHub repo: https://github.com/leap-stc/ClimSim/tree/main/preprocessing/normalizations

Read more: https://arxiv.org/abs/2306.08754.
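A minimal sketch of applying (or undoing) that normalization with xarray; the file names (`input_mean.nc`, `input_max.nc`, `input_min.nc`) and the min-max-style scaling are assumptions based on the ClimSim repo layout and paper, not guaranteed by this card:

```python
import xarray as xr

# Assumed file names from the ClimSim preprocessing/normalizations folder.
input_mean = xr.open_dataset("input_mean.nc")
input_max = xr.open_dataset("input_max.nc")
input_min = xr.open_dataset("input_min.nc")

def unnormalize_inputs(x_norm):
    # Invert x_norm = (x - mean) / (max - min), the scaling assumed
    # to have been applied to the inputs in this dataset.
    return x_norm * (input_max - input_min) + input_mean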
LEAP/subsampled_low_res
[ "arxiv:2306.08754", "region:us" ]
2023-08-18T19:31:09+00:00
{}
2023-10-09T15:42:18+00:00
[ "2306.08754" ]
[]
TAGS #arxiv-2306.08754 #region-us
Inputs and targets in this dataset are pre-normalized and scaled with .nc files found on the GitHub repo: URL Read more: URL
[]
[ "TAGS\n#arxiv-2306.08754 #region-us \n" ]
[ 14 ]
[ "passage: TAGS\n#arxiv-2306.08754 #region-us \n" ]
48d8d159d6447168aeadb6d210f0ed70d7dccccf
# COMPARING SIMPLE DEEP LEARNING MODELS TO A COMPLEX MODEL FOR SMOKE DETECTION

- **Homepage:** [Sage Continuum](https://sagecontinuum.org/)
- **Author:** Jakub Szumny, Math and Computer Science Division, University of Illinois at Urbana-Champaign
- **Mentors:** Bhupendra Raut, Seongha Park
- **Repository:** [GitHub Repository](https://github.com/waggle-sensor/summer2023/tree/main/szumny)

# Motivation

- Forest fires are a major problem and have detrimental effects on the environment. Current solutions for detecting forest fires are not efficient enough, and existing machine learning models are far too slow and not accurate enough. This study is a continuation of the work done by UCSD on their SmokeyNet deep learning architecture for smoke detection.
- My goal is to compare many different deep learning models in order to find the best model for this task, and to determine whether a simple model can match a complex one. The models I compared are VGG16, UCSD SmokeyNet, ResNet18, ResNet34, and ResNet50.

# Major Accomplishments

- Created a large dataset of 41,000 images drawn from many different wildfire events from HPWREN. I split the images into 5 different classes: sky, ground, horizon, cloud, and smoke.
- Tested in many different ways and found that the best results come from grouping the sky, ground, and horizon classes together as "other" while leaving smoke and cloud separate. The major issue with this is that smoke and clouds often look very similar.
- On my dataset, created with HPWREN images, each model performed rather well, with roughly the same accuracy of around 90%.
- Found that the VGG16 model with 3 classes (smoke, cloud, other) was the best-performing model on the testing dataset from ARM, while all the other models performed quite poorly.
- Keep in mind that the burning event was not very obvious in the ARM testing data, but real events won't always be clear-cut, so it is a great test of which model performs best with the least information.
- With an FPR of about 13%, a TPR of about 96%, an FNR of about 4%, and a TNR of about 88%, the VGG16 model had the best results on the ARM data.
- Created a plugin application to test and use my model and algorithm on Wild Sage nodes, taking images and detecting smoke in real time.

# Impact

- The impact of my research is a large dataset created for future research and for better model development.
- Found that a simple model is very accurate and can rival a complex model.
- Built an algorithm that can classify an entire image in a very short period of time.
- This research can greatly help the fight against forest fires by making it possible to attend to them before they get out of control.

# Future Direction

- More work is needed on creating a more efficient model. There may be a different model that can perform even better at detecting smoke.
- Since a dataset has already been created and shared through my GitHub repository, anyone can replicate my work and try to improve on it.
- Need to explore more ways to augment the images, such as scaling the contrast levels, as I believe this would be a good way to separate smoke from cloud from other.

# Citation

Dewangan A, Pande Y, Braun H-W, Vernon F, Perez I, Altintas I, Cottrell GW, Nguyen MH. FIgLib & SmokeyNet: Dataset and Deep Learning Model for Real-Time Wildland Fire Smoke Detection. Remote Sensing. 2022; 14(4):1007. https://doi.org/10.3390/rs14041007
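A minimal usage sketch (not the author's training or evaluation code) for loading this dataset and inspecting its three classes with the `datasets` library, based on the features declared in the dataset metadata:

```python
from datasets import load_dataset

# Load the smoke dataset; the label feature carries the class names
# declared in the dataset metadata: cloud, other, smoke.
ds = load_dataset("sagecontinuum/smokedataset")
labels = ds["train"].features["label"].names  # ['cloud', 'other', 'smoke']

example = ds["train"][0]
print(labels[example["label"]], example["image"].size)
```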
sagecontinuum/smokedataset
[ "task_categories:image-classification", "task_ids:multi-label-image-classification", "license:mit", "climate", "region:us" ]
2023-08-18T19:35:00+00:00
{"license": "mit", "task_categories": ["image-classification"], "task_ids": ["multi-label-image-classification"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "cloud", "1": "other", "2": "smoke"}}}}], "splits": [{"name": "train", "num_bytes": 85556006, "num_examples": 14318}, {"name": "validation", "num_bytes": 22137739, "num_examples": 3671}, {"name": "test", "num_bytes": 11026374, "num_examples": 1843}], "download_size": 132474880, "dataset_size": 118720119}, "tags": ["climate"]}
2023-09-11T19:57:58+00:00
[]
[]
TAGS #task_categories-image-classification #task_ids-multi-label-image-classification #license-mit #climate #region-us
# COMPARING SIMPLE DEEP LEARNING MODELS TO A COMPLEX MODEL FOR SMOKE DETECTION - Homepage: Sage Continuum - Author: Jakub Szumny, Math and Computer Science Division, University of Illinois at Urbana-Champaign - Mentors: Bhupendra Raut, Seongha Park - Repository: GitHub Repository # Motivation - Forest fires are a major problem, and have detrimental effects on the environment. Current solutions to detecting forest fires are not efficient enough, and other machine learning models have far too long computational speeds and poor accuracies. This study is a continuation of the work done by UCSD, and their SmokeyNet deep learning architecture for smoke detection. - My goal is to compare many different deep learning models, in order to find the best model for this issue, and to find if a simple model can compare to a complex model. The models which I compared are: VGG16, UCSD SmokeyNet, Resnet18, Resnet34, and Resnet50. # Major Accomplishments - Created a large dataset of 41,000 images, comprised of many different wildfire events from HPWREN. I split the images into 5 different classes: sky, ground, horizon, cloud, and smoke. - Tested in many different ways, and found that the best results are when the classes: sky, ground, and horizon, are grouped together as other, and smoke and cloud are left separate. The major issue with this, is that smoke and clouds often look very similar. - On my dataset, created with HPWREN images, each model performed rather well, having about the same accuracy at around 90%. - Found that the VGG16 model with 3 features (smoke, cloud, other), was the best performing model on the testing dataset from ARM, and all the other models performed quite poorly. - Must keep in mind that the burning event was not very obvious in the ARM testing data, but it won’t always be cut and clear, so it is a great test to see which model perform best with the least. - With a FPR of about 13%, a TPR of about 96%, a FNR of about 4%, and a TNR of about 88%, the VGG16 model had the best results, on the ARM Data. - Created a plugin application to be able to test and use my model and algorithm on wild sage nodes, taking images and detecting smoke in real time. # Impact - The impact my research has made, is having created a large dataset for future research, and for better model creation. - Found that a simple model is very accurate and can compare to a complex model. - An algorithm which can compute and classify an entire image in a very short period of time. - This research can greatly help the fight against forest fires, in order to at one point solve the problem of forest fires, by being able to attend to them before they get out of control. # Future Direction - More work is needed on creating a more efficient model. There may be a different model which can perform even better on detecting smoke. - It is helpful as a dataset is already created, and through my Github repository, anyone can replicate my work, and try to improve on it. - Need to explore more ways to augment the images, by scaling the contrast levels, etc, as I believe this would be a good way to separate smoke from cloud from other. Dewangan A, Pande Y, Braun H-W, Vernon F, Perez I, Altintas I, Cottrell GW, Nguyen MH. FIgLib & SmokeyNet: Dataset and Deep Learning Model for Real-Time Wildland Fire Smoke Detection. Remote Sensing. 2022; 14(4):1007. URL
[ "# COMPARING SIMPLE DEEP LEARNING MODELS TO A COMPLEX MODEL FOR SMOKE DETECTION\n- Homepage: Sage Continuum\n- Author: Jakub Szumny, Math and Computer Science Division, University of Illinois at Urbana-Champaign\n- Mentors: Bhupendra Raut, Seongha Park\n- Repository: GitHub Repository", "# Motivation\n- Forest fires are a major problem, and have detrimental effects on the environment. Current solutions to detecting forest fires are not efficient enough, and other machine learning models have far too long computational speeds and poor accuracies. This study is a\ncontinuation of the work done by UCSD, and their SmokeyNet deep learning architecture for smoke detection.\n- My goal is to compare many different deep learning models, in order to find the best model for this issue, and to find if a simple model can compare to a complex model. The models which I compared are: VGG16, UCSD SmokeyNet, Resnet18, Resnet34, and Resnet50.", "# Major Accomplishments\n- Created a large dataset of 41,000 images, comprised of many different wildfire events from HPWREN. I split the images into 5 different classes: sky, ground, horizon, cloud, and smoke.\n- Tested in many different ways, and found that the best results are when the classes: sky, ground, and horizon, are grouped together as other, and smoke and cloud are left separate. The major issue with this, is that smoke and clouds often look very similar.\n- On my dataset, created with HPWREN images, each model performed rather well, having about the same accuracy at around 90%.\n- Found that the VGG16 model with 3 features (smoke, cloud, other), was the best performing model on the testing dataset from ARM, and all the other models performed quite poorly.\n- Must keep in mind that the burning event was not very obvious in the ARM testing data, but it won’t always be cut and clear, so it is a great test to see which model perform best with the least.\n- With a FPR of about 13%, a TPR of about 96%, a FNR of about 4%, and a TNR of about 88%, the VGG16 model had the best results, on\nthe ARM Data.\n- Created a plugin application to be able to test and use my model and algorithm on wild sage nodes, taking images and detecting smoke in real time.", "# Impact\n- The impact my research has made, is having created a large dataset for future research, and for better model creation.\n- Found that a simple model is very accurate and can compare to a complex model.\n- An algorithm which can compute and classify an entire image in a very short period of time.\n- This research can greatly help the fight against forest fires, in order to at one point solve the problem of forest fires, by being able to attend to them before they get out of control.", "# Future Direction\n- More work is needed on creating a more efficient model. There may be a different model which can perform even better on detecting smoke.\n- It is helpful as a dataset is already created, and through my Github repository, anyone can replicate my work,\nand try to improve on it.\n- Need to explore more ways to augment the images, by scaling the contrast levels, etc, as I believe this would be a good way to separate smoke from cloud from other.\n\nDewangan A, Pande Y, Braun H-W, Vernon F, Perez I, Altintas I, Cottrell GW, Nguyen MH. FIgLib & SmokeyNet: Dataset and Deep Learning Model for\nReal-Time Wildland Fire Smoke Detection. Remote Sensing. 2022; 14(4):1007. URL" ]
[ "TAGS\n#task_categories-image-classification #task_ids-multi-label-image-classification #license-mit #climate #region-us \n", "# COMPARING SIMPLE DEEP LEARNING MODELS TO A COMPLEX MODEL FOR SMOKE DETECTION\n- Homepage: Sage Continuum\n- Author: Jakub Szumny, Math and Computer Science Division, University of Illinois at Urbana-Champaign\n- Mentors: Bhupendra Raut, Seongha Park\n- Repository: GitHub Repository", "# Motivation\n- Forest fires are a major problem, and have detrimental effects on the environment. Current solutions to detecting forest fires are not efficient enough, and other machine learning models have far too long computational speeds and poor accuracies. This study is a\ncontinuation of the work done by UCSD, and their SmokeyNet deep learning architecture for smoke detection.\n- My goal is to compare many different deep learning models, in order to find the best model for this issue, and to find if a simple model can compare to a complex model. The models which I compared are: VGG16, UCSD SmokeyNet, Resnet18, Resnet34, and Resnet50.", "# Major Accomplishments\n- Created a large dataset of 41,000 images, comprised of many different wildfire events from HPWREN. I split the images into 5 different classes: sky, ground, horizon, cloud, and smoke.\n- Tested in many different ways, and found that the best results are when the classes: sky, ground, and horizon, are grouped together as other, and smoke and cloud are left separate. The major issue with this, is that smoke and clouds often look very similar.\n- On my dataset, created with HPWREN images, each model performed rather well, having about the same accuracy at around 90%.\n- Found that the VGG16 model with 3 features (smoke, cloud, other), was the best performing model on the testing dataset from ARM, and all the other models performed quite poorly.\n- Must keep in mind that the burning event was not very obvious in the ARM testing data, but it won’t always be cut and clear, so it is a great test to see which model perform best with the least.\n- With a FPR of about 13%, a TPR of about 96%, a FNR of about 4%, and a TNR of about 88%, the VGG16 model had the best results, on\nthe ARM Data.\n- Created a plugin application to be able to test and use my model and algorithm on wild sage nodes, taking images and detecting smoke in real time.", "# Impact\n- The impact my research has made, is having created a large dataset for future research, and for better model creation.\n- Found that a simple model is very accurate and can compare to a complex model.\n- An algorithm which can compute and classify an entire image in a very short period of time.\n- This research can greatly help the fight against forest fires, in order to at one point solve the problem of forest fires, by being able to attend to them before they get out of control.", "# Future Direction\n- More work is needed on creating a more efficient model. There may be a different model which can perform even better on detecting smoke.\n- It is helpful as a dataset is already created, and through my Github repository, anyone can replicate my work,\nand try to improve on it.\n- Need to explore more ways to augment the images, by scaling the contrast levels, etc, as I believe this would be a good way to separate smoke from cloud from other.\n\nDewangan A, Pande Y, Braun H-W, Vernon F, Perez I, Altintas I, Cottrell GW, Nguyen MH. FIgLib & SmokeyNet: Dataset and Deep Learning Model for\nReal-Time Wildland Fire Smoke Detection. Remote Sensing. 2022; 14(4):1007. URL" ]
[ 40, 83, 149, 319, 109, 179 ]
[ "passage: TAGS\n#task_categories-image-classification #task_ids-multi-label-image-classification #license-mit #climate #region-us \n# COMPARING SIMPLE DEEP LEARNING MODELS TO A COMPLEX MODEL FOR SMOKE DETECTION\n- Homepage: Sage Continuum\n- Author: Jakub Szumny, Math and Computer Science Division, University of Illinois at Urbana-Champaign\n- Mentors: Bhupendra Raut, Seongha Park\n- Repository: GitHub Repository# Motivation\n- Forest fires are a major problem, and have detrimental effects on the environment. Current solutions to detecting forest fires are not efficient enough, and other machine learning models have far too long computational speeds and poor accuracies. This study is a\ncontinuation of the work done by UCSD, and their SmokeyNet deep learning architecture for smoke detection.\n- My goal is to compare many different deep learning models, in order to find the best model for this issue, and to find if a simple model can compare to a complex model. The models which I compared are: VGG16, UCSD SmokeyNet, Resnet18, Resnet34, and Resnet50." ]
589f237c7ecaec28043849253759b0ed52446d4f
# Images from the Bad River site in Northern Wisconsin

- **Homepage:** [Sage Continuum](https://sagecontinuum.org/)
- **Author:** Alex Arnold, Northwestern University
- **Mentors:** Bhupendra Raut, Seongha Park
- **Repository:** [GitHub Repository](https://github.com/waggle-sensor/summer2023/tree/main/Arnold)

# Introduction

Ice and snowfall are incredibly important parts of a river ecosystem. The Bad River is home to wild rice, which is very temperamental and prone to natural boom/bust years. A snow classifier can be used to create a larger dataset of snow imagery for a variety of additional tasks, including assisting with predicting wild rice yields.

# The Data

Two Waggle nodes were collecting both images and other data from the Bad River in the past year. The W014 Waggle node collected data in 2022 up until December, when it went offline. In January, a second node (W083) started collecting images pointing at essentially the same spot. This gave me a collection of 3,500 images to work with. Luckily, about half of them had snow of some kind and half did not, so there weren't any major class-imbalance problems. One of the big decisions I had to make was when to count an image as having snow. Did a few patches count? Did a light dusting of snow count? In the end, I elected to count _any_ snow on the ground to simplify the problem. The two images below are from W014 and W083 respectively.

![W014](md_images/W014.jpg)

![W083](md_images/W083.jpg)

The nodes took a picture once every hour, so some images were at night and too dark to see. Images where I couldn't discern whether there was snow or not (snow fell at night at an unclear time) were discarded from the dataset. Darker images were still included if I could confirm that they contained snow.

# Approach

First, the images needed to be preprocessed and transformed. One problem snow detection runs into is the similarity between snow and clouds. Unsupervised methods based on color often classify clouds as snow as well, but this issue is addressed through deep learning and some more heavy-handed techniques. Neural networks can (hopefully) learn to depend not only on color but also on other information such as texture. To help the network along, I also cropped out the sky from the images, in addition to applying other transforms such as solarization. Solarization randomly reverses the brightness of pixels over a certain threshold, so the model can't depend on which pixels are very bright. These changes force the model to learn to recognize snow on the ground through attributes beyond color.

Our goal was to create a machine-learning model that could detect whether there was snow on the ground around the river. Convolutional neural networks are the main tool of choice for these kinds of image-related tasks. They work by using a sliding "window" across an image to capture relationships and patterns between pixels. This sliding-window approach reduces the number of parameters and the complexity of the model. There are already a multitude of pre-trained convolutional network models that perform well on image classification tasks, but there aren't any deep learning models trained specifically for snow detection. Luckily, _transfer learning_ comes to the rescue, making it possible to train a new model with limited time and computational power. Transfer learning works by taking an image classification model that someone else has already taken the time to train and reusing it for a new purpose.
I utilized ResNet50 [1], a popular convolutional neural network model that pioneered a technique called residual connections. Residual connections allow neural networks to optimize quickly while still being deep enough to capture complex relationships. ResNet50 is a very deep network with fifty layers (hence the name) and would take a lot of time and computing power to train even with the residual connections, but luckily some free pre-trained models are essentially plug-and-play with only small modifications. A visualization of ResNet50's architecture is seen below [2].

![ResNet50 Model (without additional layers)](md_images/ResNet50.png)

The theory behind transfer learning is that ResNet50 has already learned to encode certain aspects of an image that are generalizable, so all it needs is a few changes to use those lower-level features to create a new prediction. To turn the model into a snow detector, I tacked on a couple of extra linear layers at the end to generate a prediction score for whether there is snow or not. This vastly sped up training compared to creating a whole new model.

# Results

The classifier was able to detect snow very accurately on images collected from W014 and W083.

![confusion matrix](md_images/badrivermatrix.png)

However, we wanted to ensure that the model wasn't simply overfitting to the images from these nodes and was actually learning something about snow. I also tested it on images from a node in Montana (W084). It didn't perform quite as well, but still accurately enough to indicate that it wasn't overfitting badly. That being said, the plugin is currently released for use only at the Bad River W083 node, as it isn't yet fit to be a general snow classifier.

# Future Steps

We weren't able to get additional data from the Bad River, but future work could look at using these images to predict turbidity and other information about the river. This could also help predict wild rice yields. More data from other Waggle nodes could be used to create a more general snow classifier usable at other locations with more confidence, but for now it works best only at the Bad River site.

# Citations

[1] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016. doi:10.1109/cvpr.2016.90

[2] https://commons.wikimedia.org/wiki/File:ResNet50.png
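For readers who want a concrete picture, here is a minimal PyTorch sketch of the setup described above, assuming torchvision's pretrained ResNet50; the hidden size, solarize threshold, and transform order are illustrative assumptions, not the author's exact configuration:

```python
import torch.nn as nn
from torchvision import models, transforms

# Illustrative preprocessing: solarization (randomly inverting bright
# pixels) plus resizing; cropping out the sky is omitted here.
transform = transforms.Compose([
    transforms.RandomSolarize(threshold=128),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Pretrained ResNet50 backbone with its classification head replaced
# by a couple of linear layers producing a snow/no-snow score.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = nn.Sequential(
    nn.Linear(backbone.fc.in_features, 256),  # assumed hidden size
    nn.ReLU(),
    nn.Linear(256, 1),  # single logit: snow vs. no snow
)
```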
sagecontinuum/snowdataset
[ "task_categories:image-classification", "task_ids:multi-label-image-classification", "license:mit", "climate", "region:us" ]
2023-08-18T19:38:50+00:00
{"license": "mit", "task_categories": ["image-classification"], "task_ids": ["multi-label-image-classification"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "path", "dtype": "string"}, {"name": "snow", "dtype": {"class_label": {"names": {"0": false, "1": true}}}}, {"name": "day", "dtype": "bool"}, {"name": "node", "dtype": "string"}], "splits": [{"name": "full", "num_bytes": 845769, "num_examples": 3563}], "download_size": 4740076026, "dataset_size": 845769}, "tags": ["climate"]}
2023-09-11T19:56:52+00:00
[]
[]
TAGS #task_categories-image-classification #task_ids-multi-label-image-classification #license-mit #climate #region-us
# Images from the Bad River site in Northern Wisconsin - Homepage: Sage Continuum - Author: Alex Arnold, Northwestern University - Mentors: Bhupendra Raut, Seongha Park - Repository: GitHub Repository # Introduction Ice and snowfall are incredibly important parts of a river ecosystem. The Bad River is home to wild rice, which is very temperamental and prone to natural boom/bust years. Having a snow classifier can be used to create a larger dataset of snow that can be used for a variety of these additional tasks including assisting with predicting wild rice yields. # The Data Two Waggle nodes were collecting both images and other data from the Bad River in the past year. The W014 Waggle node was collecting data in 2022 up until December when it went offline, In January a second node (W083) started collecting images pointing at essentially the same spot. This gave me a collection of 3500 images to work with. Luckily about half of them had snow of some kind and half did not so there weren't any major class imbalance problems. One of the big decisions I had to make was when to count an image as having snow. Did a few patches count? Did a light dusting of snow count? In the end, I elected to count _any_ snow on the ground to simplify the problem. The two images below are from W014 and W083 respectively. !W014 !Wo83 The nodes took a picture once every hour, so some images were at night and too dark to see. Images where I couldn't discern whether there was snow or not (snow fell at night at an unclear time) were discarded from the dataset. Darker images were still included if I could confirm that they contained snow. # Approach First, the images needed to be preprocessed and transformed. One problem snow detection runs into is the similarity between snow and clouds. Unsupervised methods based on color often classify clouds as also being snow, but this issue is solved through the use of deep learning and some more heavy-handed techniques. Neural networks (hopefully) can learn not to depend only on color but instead on other information such as texture. To help the network along I also cropped out the sky from the images in addition to other transforms such as solarization. Solarization randomly reverses the brightness of pixels over a certain threshold so it can't depend on which pixels are very bright. These changes force the model to learn to recognize snow on the ground through additional attributes in addition to color. Our goal was to create a machine-learning model that could detect whether there was snow on the ground around the river. Convolutional neural networks are the main tool of choice for these kinds of image related tasks. They work by using a sliding "window" across an image to capture relationships and patterns between pixels across the image. This sliding window approach reduces the number of parameters and complexity of the model. There are already a multitude of pre-trained convolutional network models out there that perform well on image classification tasks, but there aren't any deep learning models trained specifically for snow detection. Luckily _transfer learning_ comes to the rescue to make training a new model incredibly easy with limited time and computational power. Transfer learning works by taking an image classification model that someone else has already taken the time to train reusing it for a new purpose. I utilized ResNet50 [1], a popular convolutional neural network model that pioneered a technique called residual connections. 
Residual connections allow neural networks to optimize quickly while still being deep enough to capture complex relationships. ResNet50 is a very deep network with fifty layers (hence the name) and would take a lot of time and computing power to train even with the residual connections, but luckily some free pre-trained models are essentially plug-and-play with only small modifications. A visualization of ResNet50's architecture is seen below [2]. !ResNet50 Model (without additional layers) The theory behind transfer learning is that ResNet50 has already learned to encode certain aspects of an image that are generalizable, so all it needs is a few changes to use those lower-level features to create a new prediction. To turn the model into a snow detector, I tacked on a couple of extra linear layers at the end to generate a prediction score for whether there is snow or not. This vastly sped up training time compared to creating a whole new model. # Results The classifier was able to detect snow incredibly accurately from images collected from W014 and W083. !confusion matrix However, we wanted to ensure that the model wasn't completely overfitting to the images from these nodes and was learning something about snow. I also tested it on images from a node in Montana (W084). It didn't perform quite as well but still performed accurately enough to indicate that it wasn't overfitting horrendously. That being said, currently, the plugin is released to be used at the Bad River W083 node as it's not fit to be a general snow classifier quite yet. # Future Steps We weren't able to get additional data from the Bad River, but additional work could look at using these images to predict turbidity data and other information about the river. This could be used to facilitate and predict wild rice yields as well. More data from other Waggle nodes could also be used to create a more general snow classifier that could be used at other locations with more confidence, but for now it's best only at the Bad River site. s [1] K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016. doi:10.1109/cvpr.2016.90 [2] URL
[ "# Images from the Bad River site in Northern Wisconsin\n- Homepage: Sage Continuum\n- Author: Alex Arnold, Northwestern University\n- Mentors: Bhupendra Raut, Seongha Park\n- Repository: GitHub Repository", "# Introduction\nIce and snowfall are incredibly important parts of a river ecosystem. The Bad River is home to wild rice, which is very temperamental and prone to natural boom/bust years. Having a snow classifier can be used to create a larger dataset of snow that can be used for a variety of these additional tasks including assisting with predicting wild rice yields.", "# The Data\nTwo Waggle nodes were collecting both images and other data from the Bad River in the past year. The W014 Waggle node was collecting data in 2022 up until December when it went offline, In January a second node (W083) started collecting images pointing at essentially the same spot. This gave me a collection of 3500 images to work with. Luckily about half of them had snow of some kind and half did not so there weren't any major class imbalance problems. One of the big decisions I had to make was when to count an image as having snow. Did a few patches count? Did a light dusting of snow count? In the end, I elected to count _any_ snow on the ground to simplify the problem. The two images below are from W014 and W083 respectively.\n\n!W014\n\n!Wo83\n\nThe nodes took a picture once every hour, so some images were at night and too dark to see. Images where I couldn't discern whether there was snow or not (snow fell at night at an unclear time) were discarded from the dataset. Darker images were still included if I could confirm that they contained snow.", "# Approach\n\nFirst, the images needed to be preprocessed and transformed. One problem snow detection runs into is the similarity between snow and clouds. Unsupervised methods based on color often classify clouds as also being snow, but this issue is solved through the use of deep learning and some more heavy-handed techniques. Neural networks (hopefully) can learn not to depend only on color but instead on other information such as texture. To help the network along I also cropped out the sky from the images in addition to other transforms such as solarization. Solarization randomly reverses the brightness of pixels over a certain threshold so it can't depend on which pixels are very bright. These changes force the model to learn to recognize snow on the ground through additional attributes in addition to color.\n\nOur goal was to create a machine-learning model that could detect whether there was snow on the ground around the river. Convolutional neural networks are the main tool of choice for these kinds of image related tasks. They work by using a sliding \"window\" across an image to capture relationships and patterns between pixels across the image. This sliding window approach reduces the number of parameters and complexity of the model. There are already a multitude of pre-trained convolutional network models out there that perform well on image classification tasks, but there aren't any deep learning models trained specifically for snow detection. Luckily _transfer learning_ comes to the rescue to make training a new model incredibly easy with limited time and computational power. \n\nTransfer learning works by taking an image classification model that someone else has already taken the time to train reusing it for a new purpose. 
I utilized ResNet50 [1], a popular convolutional neural network model that pioneered a technique called residual connections. Residual connections allow neural networks to optimize quickly while still being deep enough to capture complex relationships. ResNet50 is a very deep network with fifty layers (hence the name) and would take a lot of time and computing power to train even with the residual connections, but luckily some free pre-trained models are essentially plug-and-play with only small modifications. A visualization of ResNet50's architecture is seen below [2].\n\n!ResNet50 Model (without additional layers)\n\nThe theory behind transfer learning is that ResNet50 has already learned to encode certain aspects of an image that are generalizable, so all it needs is a few changes to use those lower-level features to create a new prediction. To turn the model into a snow detector, I tacked on a couple of extra linear layers at the end to generate a prediction score for whether there is snow or not. This vastly sped up training time compared to creating a whole new model.", "# Results\nThe classifier was able to detect snow incredibly accurately from images collected from W014 and W083.\n!confusion matrix\nHowever, we wanted to ensure that the model wasn't completely overfitting to the images from these nodes and was learning something about snow. I also tested it on images from a node in Montana (W084). It didn't perform quite as well but still performed accurately enough to indicate that it wasn't overfitting horrendously. That being said, currently, the plugin is released to be used at the Bad River W083 node as it's not fit to be a general snow classifier quite yet.", "# Future Steps\n\nWe weren't able to get additional data from the Bad River, but additional work could look at using these images to predict turbidity data and other information about the river. This could be used to facilitate and predict wild rice yields as well. More data from other Waggle nodes could also be used to create a more general snow classifier that could be used at other locations with more confidence, but for now it's best only at the Bad River site. \n\ns\n[1] K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016. doi:10.1109/cvpr.2016.90 \n\n[2] URL" ]
[ "TAGS\n#task_categories-image-classification #task_ids-multi-label-image-classification #license-mit #climate #region-us \n", "# Images from the Bad River site in Northern Wisconsin\n- Homepage: Sage Continuum\n- Author: Alex Arnold, Northwestern University\n- Mentors: Bhupendra Raut, Seongha Park\n- Repository: GitHub Repository", "# Introduction\nIce and snowfall are incredibly important parts of a river ecosystem. The Bad River is home to wild rice, which is very temperamental and prone to natural boom/bust years. Having a snow classifier can be used to create a larger dataset of snow that can be used for a variety of these additional tasks including assisting with predicting wild rice yields.", "# The Data\nTwo Waggle nodes were collecting both images and other data from the Bad River in the past year. The W014 Waggle node was collecting data in 2022 up until December when it went offline, In January a second node (W083) started collecting images pointing at essentially the same spot. This gave me a collection of 3500 images to work with. Luckily about half of them had snow of some kind and half did not so there weren't any major class imbalance problems. One of the big decisions I had to make was when to count an image as having snow. Did a few patches count? Did a light dusting of snow count? In the end, I elected to count _any_ snow on the ground to simplify the problem. The two images below are from W014 and W083 respectively.\n\n!W014\n\n!Wo83\n\nThe nodes took a picture once every hour, so some images were at night and too dark to see. Images where I couldn't discern whether there was snow or not (snow fell at night at an unclear time) were discarded from the dataset. Darker images were still included if I could confirm that they contained snow.", "# Approach\n\nFirst, the images needed to be preprocessed and transformed. One problem snow detection runs into is the similarity between snow and clouds. Unsupervised methods based on color often classify clouds as also being snow, but this issue is solved through the use of deep learning and some more heavy-handed techniques. Neural networks (hopefully) can learn not to depend only on color but instead on other information such as texture. To help the network along I also cropped out the sky from the images in addition to other transforms such as solarization. Solarization randomly reverses the brightness of pixels over a certain threshold so it can't depend on which pixels are very bright. These changes force the model to learn to recognize snow on the ground through additional attributes in addition to color.\n\nOur goal was to create a machine-learning model that could detect whether there was snow on the ground around the river. Convolutional neural networks are the main tool of choice for these kinds of image related tasks. They work by using a sliding \"window\" across an image to capture relationships and patterns between pixels across the image. This sliding window approach reduces the number of parameters and complexity of the model. There are already a multitude of pre-trained convolutional network models out there that perform well on image classification tasks, but there aren't any deep learning models trained specifically for snow detection. Luckily _transfer learning_ comes to the rescue to make training a new model incredibly easy with limited time and computational power. 
\n\nTransfer learning works by taking an image classification model that someone else has already taken the time to train and reusing it for a new purpose. I utilized ResNet50 [1], a popular convolutional neural network model that pioneered a technique called residual connections. Residual connections allow neural networks to optimize quickly while still being deep enough to capture complex relationships. ResNet50 is a very deep network with fifty layers (hence the name) and would take a lot of time and computing power to train even with the residual connections, but luckily some free pre-trained models are essentially plug-and-play with only small modifications. A visualization of ResNet50's architecture is seen below [2].\n\n!ResNet50 Model (without additional layers)\n\nThe theory behind transfer learning is that ResNet50 has already learned to encode certain aspects of an image that are generalizable, so all it needs is a few changes to use those lower-level features to create a new prediction. To turn the model into a snow detector, I tacked on a couple of extra linear layers at the end to generate a prediction score for whether there is snow or not. This vastly sped up training time compared to creating a whole new model.", "# Results\nThe classifier was able to detect snow incredibly accurately from images collected from W014 and W083.\n!confusion matrix\nHowever, we wanted to ensure that the model wasn't completely overfitting to the images from these nodes and was learning something about snow. I also tested it on images from a node in Montana (W084). It didn't perform quite as well, but it still performed accurately enough to indicate that it wasn't overfitting horrendously. That being said, the plugin is currently released for use only at the Bad River W083 node, as it isn't fit to be a general snow classifier quite yet.", "# Future Steps\n\nWe weren't able to get additional data from the Bad River, but additional work could look at using these images to predict turbidity data and other information about the river. This could also be used to help predict wild rice yields. More data from other Waggle nodes could also be used to create a more general snow classifier that could be used at other locations with more confidence, but for now it's best suited only to the Bad River site.\n\n# References\n\n[1] K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016. doi:10.1109/cvpr.2016.90\n\n[2] URL" ]
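The preprocessing steps described in the Approach section (cropping out the sky and applying solarization among other transforms) could be expressed as a torchvision pipeline along the lines of the sketch below; the crop region, solarization threshold, and input size are illustrative guesses rather than values from the card.

```python
from torchvision import transforms

# Illustrative preprocessing pipeline for the steps the card describes;
# all parameter values here are assumptions, not the author's settings.
preprocess = transforms.Compose([
    # Crop out the sky by keeping the lower two-thirds of the frame, so
    # bright clouds can't be mistaken for snow (placeholder proportion).
    transforms.Lambda(lambda img: img.crop((0, img.height // 3, img.width, img.height))),
    transforms.Resize((224, 224)),  # a common input size for ResNet50
    # Randomly invert pixels brighter than the threshold (solarization),
    # so the model can't rely on raw brightness alone to detect snow.
    transforms.RandomSolarize(threshold=192),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
```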
[ 40, 53, 82, 262, 624, 146, 168 ]
[ "passage: TAGS\n#task_categories-image-classification #task_ids-multi-label-image-classification #license-mit #climate #region-us \n# Images from the Bad River site in Northern Wisconsin\n- Homepage: Sage Continuum\n- Author: Alex Arnold, Northwestern University\n- Mentors: Bhupendra Raut, Seongha Park\n- Repository: GitHub Repository# Introduction\nIce and snowfall are incredibly important parts of a river ecosystem. The Bad River is home to wild rice, which is very temperamental and prone to natural boom/bust years. Having a snow classifier can be used to create a larger dataset of snow that can be used for a variety of these additional tasks including assisting with predicting wild rice yields.# The Data\nTwo Waggle nodes were collecting both images and other data from the Bad River in the past year. The W014 Waggle node was collecting data in 2022 up until December when it went offline, In January a second node (W083) started collecting images pointing at essentially the same spot. This gave me a collection of 3500 images to work with. Luckily about half of them had snow of some kind and half did not so there weren't any major class imbalance problems. One of the big decisions I had to make was when to count an image as having snow. Did a few patches count? Did a light dusting of snow count? In the end, I elected to count _any_ snow on the ground to simplify the problem. The two images below are from W014 and W083 respectively.\n\n!W014\n\n!Wo83\n\nThe nodes took a picture once every hour, so some images were at night and too dark to see. Images where I couldn't discern whether there was snow or not (snow fell at night at an unclear time) were discarded from the dataset. Darker images were still included if I could confirm that they contained snow." ]
941a4804fbc8a60786b858304ce8be7b3998b643
# Dataset Card for "Thunderbird_GPTNEO_Finetuned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/Thunderbird_GPTNEO_Finetuned
[ "region:us" ]
2023-08-18T19:47:54+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "768", "dtype": "float32"}, {"name": "769", "dtype": "float32"}, {"name": "770", "dtype": "float32"}, {"name": "771", "dtype": "float32"}, {"name": "772", "dtype": "float32"}, {"name": "773", "dtype": "float32"}, {"name": "774", "dtype": "float32"}, {"name": "775", "dtype": "float32"}, {"name": "776", "dtype": "float32"}, {"name": "777", "dtype": "float32"}, {"name": "778", "dtype": "float32"}, {"name": "779", "dtype": "float32"}, {"name": "780", "dtype": "float32"}, {"name": "781", "dtype": "float32"}, {"name": "782", "dtype": "float32"}, {"name": "783", "dtype": "float32"}, {"name": "784", "dtype": "float32"}, {"name": "785", "dtype": "float32"}, {"name": "786", "dtype": "float32"}, {"name": "787", "dtype": "float32"}, {"name": "788", "dtype": "float32"}, {"name": "789", "dtype": "float32"}, {"name": "790", "dtype": "float32"}, {"name": "791", "dtype": "float32"}, {"name": "792", "dtype": "float32"}, {"name": "793", "dtype": "float32"}, {"name": "794", "dtype": "float32"}, {"name": "795", "dtype": "float32"}, {"name": "796", "dtype": "float32"}, {"name": "797", "dtype": "float32"}, {"name": "798", "dtype": "float32"}, {"name": "799", "dtype": "float32"}, {"name": "800", "dtype": "float32"}, {"name": "801", "dtype": "float32"}, {"name": "802", "dtype": "float32"}, {"name": "803", "dtype": "float32"}, {"name": "804", "dtype": "float32"}, {"name": "805", "dtype": "float32"}, {"name": "806", "dtype": "float32"}, {"name": "807", "dtype": "float32"}, {"name": "808", "dtype": "float32"}, {"name": "809", "dtype": "float32"}, {"name": "810", "dtype": "float32"}, {"name": "811", "dtype": "float32"}, {"name": "812", "dtype": "float32"}, {"name": "813", "dtype": "float32"}, {"name": "814", "dtype": "float32"}, {"name": "815", "dtype": "float32"}, {"name": "816", "dtype": "float32"}, {"name": "817", "dtype": "float32"}, {"name": "818", "dtype": "float32"}, {"name": "819", "dtype": "float32"}, {"name": "820", "dtype": "float32"}, {"name": "821", "dtype": "float32"}, {"name": "822", "dtype": "float32"}, {"name": "823", "dtype": "float32"}, {"name": "824", "dtype": "float32"}, {"name": "825", "dtype": "float32"}, {"name": "826", "dtype": "float32"}, {"name": "827", "dtype": "float32"}, {"name": "828", "dtype": "float32"}, {"name": "829", "dtype": "float32"}, {"name": "830", "dtype": "float32"}, {"name": "831", "dtype": "float32"}, {"name": "832", "dtype": "float32"}, {"name": "833", "dtype": "float32"}, {"name": "834", "dtype": "float32"}, {"name": "835", "dtype": "float32"}, {"name": "836", "dtype": "float32"}, {"name": "837", "dtype": "float32"}, {"name": "838", "dtype": "float32"}, {"name": "839", "dtype": "float32"}, {"name": "840", "dtype": "float32"}, {"name": "841", "dtype": "float32"}, {"name": "842", "dtype": "float32"}, {"name": "843", "dtype": "float32"}, {"name": "844", "dtype": "float32"}, {"name": "845", "dtype": "float32"}, {"name": "846", "dtype": "float32"}, {"name": "847", "dtype": "float32"}, {"name": "848", "dtype": "float32"}, {"name": "849", "dtype": "float32"}, {"name": "850", "dtype": "float32"}, {"name": "851", "dtype": "float32"}, {"name": "852", "dtype": "float32"}, {"name": "853", "dtype": "float32"}, {"name": "854", "dtype": "float32"}, {"name": "855", "dtype": "float32"}, {"name": "856", "dtype": "float32"}, {"name": "857", "dtype": "float32"}, {"name": "858", "dtype": "float32"}, {"name": "859", "dtype": "float32"}, {"name": "860", "dtype": "float32"}, {"name": "861", "dtype": "float32"}, {"name": 
"862", "dtype": "float32"}, {"name": "863", "dtype": "float32"}, {"name": "864", "dtype": "float32"}, {"name": "865", "dtype": "float32"}, {"name": "866", "dtype": "float32"}, {"name": "867", "dtype": "float32"}, {"name": "868", "dtype": "float32"}, {"name": "869", "dtype": "float32"}, {"name": "870", "dtype": "float32"}, {"name": "871", "dtype": "float32"}, {"name": "872", "dtype": "float32"}, {"name": "873", "dtype": "float32"}, {"name": "874", "dtype": "float32"}, {"name": "875", "dtype": "float32"}, {"name": "876", "dtype": "float32"}, {"name": "877", "dtype": "float32"}, {"name": "878", "dtype": "float32"}, {"name": "879", "dtype": "float32"}, {"name": "880", "dtype": "float32"}, {"name": "881", "dtype": "float32"}, {"name": "882", "dtype": "float32"}, {"name": "883", "dtype": "float32"}, {"name": "884", "dtype": "float32"}, {"name": "885", "dtype": "float32"}, {"name": "886", "dtype": "float32"}, {"name": "887", "dtype": "float32"}, {"name": "888", "dtype": "float32"}, {"name": "889", "dtype": "float32"}, {"name": "890", "dtype": "float32"}, {"name": "891", "dtype": "float32"}, {"name": "892", "dtype": "float32"}, {"name": "893", "dtype": "float32"}, {"name": "894", "dtype": "float32"}, {"name": "895", "dtype": "float32"}, {"name": "896", "dtype": "float32"}, {"name": "897", "dtype": "float32"}, {"name": "898", "dtype": "float32"}, {"name": "899", "dtype": "float32"}, {"name": "900", "dtype": "float32"}, {"name": "901", "dtype": "float32"}, {"name": "902", "dtype": "float32"}, {"name": "903", "dtype": "float32"}, {"name": "904", "dtype": "float32"}, {"name": "905", "dtype": "float32"}, {"name": "906", "dtype": "float32"}, {"name": "907", "dtype": "float32"}, {"name": "908", "dtype": "float32"}, {"name": "909", "dtype": "float32"}, {"name": "910", "dtype": "float32"}, {"name": "911", "dtype": "float32"}, {"name": "912", "dtype": "float32"}, {"name": "913", "dtype": "float32"}, {"name": "914", "dtype": "float32"}, {"name": "915", "dtype": "float32"}, {"name": "916", "dtype": "float32"}, {"name": "917", "dtype": "float32"}, {"name": "918", "dtype": "float32"}, {"name": "919", "dtype": "float32"}, {"name": "920", "dtype": "float32"}, {"name": "921", "dtype": "float32"}, {"name": "922", "dtype": "float32"}, {"name": "923", "dtype": "float32"}, {"name": "924", "dtype": "float32"}, {"name": "925", "dtype": "float32"}, {"name": "926", "dtype": "float32"}, {"name": "927", "dtype": "float32"}, {"name": "928", "dtype": "float32"}, {"name": "929", "dtype": "float32"}, {"name": "930", "dtype": "float32"}, {"name": "931", "dtype": "float32"}, {"name": "932", "dtype": "float32"}, {"name": "933", "dtype": "float32"}, {"name": "934", "dtype": "float32"}, {"name": "935", "dtype": "float32"}, {"name": "936", "dtype": "float32"}, {"name": "937", "dtype": "float32"}, {"name": "938", "dtype": "float32"}, {"name": "939", "dtype": "float32"}, {"name": "940", "dtype": "float32"}, {"name": "941", "dtype": "float32"}, {"name": "942", "dtype": "float32"}, {"name": "943", "dtype": "float32"}, {"name": "944", "dtype": "float32"}, {"name": "945", "dtype": "float32"}, {"name": "946", "dtype": "float32"}, {"name": "947", "dtype": "float32"}, {"name": "948", "dtype": "float32"}, {"name": "949", "dtype": "float32"}, {"name": "950", "dtype": "float32"}, {"name": "951", "dtype": "float32"}, {"name": "952", "dtype": "float32"}, {"name": "953", "dtype": "float32"}, {"name": "954", "dtype": "float32"}, {"name": "955", "dtype": "float32"}, {"name": "956", "dtype": "float32"}, {"name": "957", "dtype": "float32"}, {"name": 
"958", "dtype": "float32"}, {"name": "959", "dtype": "float32"}, {"name": "960", "dtype": "float32"}, {"name": "961", "dtype": "float32"}, {"name": "962", "dtype": "float32"}, {"name": "963", "dtype": "float32"}, {"name": "964", "dtype": "float32"}, {"name": "965", "dtype": "float32"}, {"name": "966", "dtype": "float32"}, {"name": "967", "dtype": "float32"}, {"name": "968", "dtype": "float32"}, {"name": "969", "dtype": "float32"}, {"name": "970", "dtype": "float32"}, {"name": "971", "dtype": "float32"}, {"name": "972", "dtype": "float32"}, {"name": "973", "dtype": "float32"}, {"name": "974", "dtype": "float32"}, {"name": "975", "dtype": "float32"}, {"name": "976", "dtype": "float32"}, {"name": "977", "dtype": "float32"}, {"name": "978", "dtype": "float32"}, {"name": "979", "dtype": "float32"}, {"name": "980", "dtype": "float32"}, {"name": "981", "dtype": "float32"}, {"name": "982", "dtype": "float32"}, {"name": "983", "dtype": "float32"}, {"name": "984", "dtype": "float32"}, {"name": "985", "dtype": "float32"}, {"name": "986", "dtype": "float32"}, {"name": "987", "dtype": "float32"}, {"name": "988", "dtype": "float32"}, {"name": "989", "dtype": "float32"}, {"name": "990", "dtype": "float32"}, {"name": "991", "dtype": "float32"}, {"name": "992", "dtype": "float32"}, {"name": "993", "dtype": "float32"}, {"name": "994", "dtype": "float32"}, {"name": "995", "dtype": "float32"}, {"name": "996", "dtype": "float32"}, {"name": "997", "dtype": "float32"}, {"name": "998", "dtype": "float32"}, {"name": "999", "dtype": "float32"}, {"name": "1000", "dtype": "float32"}, {"name": "1001", "dtype": "float32"}, {"name": "1002", "dtype": "float32"}, {"name": "1003", "dtype": "float32"}, {"name": "1004", "dtype": "float32"}, {"name": "1005", "dtype": "float32"}, {"name": "1006", "dtype": "float32"}, {"name": "1007", "dtype": "float32"}, {"name": "1008", "dtype": "float32"}, {"name": "1009", "dtype": "float32"}, {"name": "1010", "dtype": "float32"}, {"name": "1011", "dtype": "float32"}, {"name": "1012", "dtype": "float32"}, {"name": "1013", "dtype": "float32"}, {"name": "1014", "dtype": "float32"}, {"name": "1015", "dtype": "float32"}, {"name": "1016", "dtype": "float32"}, {"name": "1017", "dtype": "float32"}, {"name": "1018", "dtype": "float32"}, {"name": "1019", "dtype": "float32"}, {"name": "1020", "dtype": "float32"}, {"name": "1021", "dtype": "float32"}, {"name": "1022", "dtype": "float32"}, {"name": "1023", "dtype": "float32"}, {"name": "1024", "dtype": "float32"}, {"name": "1025", "dtype": "float32"}, {"name": "1026", "dtype": "float32"}, {"name": "1027", "dtype": "float32"}, {"name": "1028", "dtype": "float32"}, {"name": "1029", "dtype": "float32"}, {"name": "1030", "dtype": "float32"}, {"name": "1031", "dtype": "float32"}, {"name": "1032", "dtype": "float32"}, {"name": "1033", "dtype": "float32"}, {"name": "1034", "dtype": "float32"}, {"name": "1035", "dtype": "float32"}, {"name": "1036", "dtype": "float32"}, {"name": "1037", "dtype": "float32"}, {"name": "1038", "dtype": "float32"}, {"name": "1039", "dtype": "float32"}, {"name": "1040", "dtype": "float32"}, {"name": "1041", "dtype": "float32"}, {"name": "1042", "dtype": "float32"}, {"name": "1043", "dtype": "float32"}, {"name": "1044", "dtype": "float32"}, {"name": "1045", "dtype": "float32"}, {"name": "1046", "dtype": "float32"}, {"name": "1047", "dtype": "float32"}, {"name": "1048", "dtype": "float32"}, {"name": "1049", "dtype": "float32"}, {"name": "1050", "dtype": "float32"}, {"name": "1051", "dtype": "float32"}, {"name": "1052", "dtype": 
"float32"}, {"name": "1053", "dtype": "float32"}, {"name": "1054", "dtype": "float32"}, {"name": "1055", "dtype": "float32"}, {"name": "1056", "dtype": "float32"}, {"name": "1057", "dtype": "float32"}, {"name": "1058", "dtype": "float32"}, {"name": "1059", "dtype": "float32"}, {"name": "1060", "dtype": "float32"}, {"name": "1061", "dtype": "float32"}, {"name": "1062", "dtype": "float32"}, {"name": "1063", "dtype": "float32"}, {"name": "1064", "dtype": "float32"}, {"name": "1065", "dtype": "float32"}, {"name": "1066", "dtype": "float32"}, {"name": "1067", "dtype": "float32"}, {"name": "1068", "dtype": "float32"}, {"name": "1069", "dtype": "float32"}, {"name": "1070", "dtype": "float32"}, {"name": "1071", "dtype": "float32"}, {"name": "1072", "dtype": "float32"}, {"name": "1073", "dtype": "float32"}, {"name": "1074", "dtype": "float32"}, {"name": "1075", "dtype": "float32"}, {"name": "1076", "dtype": "float32"}, {"name": "1077", "dtype": "float32"}, {"name": "1078", "dtype": "float32"}, {"name": "1079", "dtype": "float32"}, {"name": "1080", "dtype": "float32"}, {"name": "1081", "dtype": "float32"}, {"name": "1082", "dtype": "float32"}, {"name": "1083", "dtype": "float32"}, {"name": "1084", "dtype": "float32"}, {"name": "1085", "dtype": "float32"}, {"name": "1086", "dtype": "float32"}, {"name": "1087", "dtype": "float32"}, {"name": "1088", "dtype": "float32"}, {"name": "1089", "dtype": "float32"}, {"name": "1090", "dtype": "float32"}, {"name": "1091", "dtype": "float32"}, {"name": "1092", "dtype": "float32"}, {"name": "1093", "dtype": "float32"}, {"name": "1094", "dtype": "float32"}, {"name": "1095", "dtype": "float32"}, {"name": "1096", "dtype": "float32"}, {"name": "1097", "dtype": "float32"}, {"name": "1098", "dtype": "float32"}, {"name": "1099", "dtype": "float32"}, {"name": "1100", "dtype": "float32"}, {"name": "1101", "dtype": "float32"}, {"name": "1102", "dtype": "float32"}, {"name": "1103", "dtype": "float32"}, {"name": "1104", "dtype": "float32"}, {"name": "1105", "dtype": "float32"}, {"name": "1106", "dtype": "float32"}, {"name": "1107", "dtype": "float32"}, {"name": "1108", "dtype": "float32"}, {"name": "1109", "dtype": "float32"}, {"name": "1110", "dtype": "float32"}, {"name": "1111", "dtype": "float32"}, {"name": "1112", "dtype": "float32"}, {"name": "1113", "dtype": "float32"}, {"name": "1114", "dtype": "float32"}, {"name": "1115", "dtype": "float32"}, {"name": "1116", "dtype": "float32"}, {"name": "1117", "dtype": "float32"}, {"name": "1118", "dtype": "float32"}, {"name": "1119", "dtype": "float32"}, {"name": "1120", "dtype": "float32"}, {"name": "1121", "dtype": "float32"}, {"name": "1122", "dtype": "float32"}, {"name": "1123", "dtype": "float32"}, {"name": "1124", "dtype": "float32"}, {"name": "1125", "dtype": "float32"}, {"name": "1126", "dtype": "float32"}, {"name": "1127", "dtype": "float32"}, {"name": "1128", "dtype": "float32"}, {"name": "1129", "dtype": "float32"}, {"name": "1130", "dtype": "float32"}, {"name": "1131", "dtype": "float32"}, {"name": "1132", "dtype": "float32"}, {"name": "1133", "dtype": "float32"}, {"name": "1134", "dtype": "float32"}, {"name": "1135", "dtype": "float32"}, {"name": "1136", "dtype": "float32"}, {"name": "1137", "dtype": "float32"}, {"name": "1138", "dtype": "float32"}, {"name": "1139", "dtype": "float32"}, {"name": "1140", "dtype": "float32"}, {"name": "1141", "dtype": "float32"}, {"name": "1142", "dtype": "float32"}, {"name": "1143", "dtype": "float32"}, {"name": "1144", "dtype": "float32"}, {"name": "1145", "dtype": "float32"}, {"name": 
"1146", "dtype": "float32"}, {"name": "1147", "dtype": "float32"}, {"name": "1148", "dtype": "float32"}, {"name": "1149", "dtype": "float32"}, {"name": "1150", "dtype": "float32"}, {"name": "1151", "dtype": "float32"}, {"name": "1152", "dtype": "float32"}, {"name": "1153", "dtype": "float32"}, {"name": "1154", "dtype": "float32"}, {"name": "1155", "dtype": "float32"}, {"name": "1156", "dtype": "float32"}, {"name": "1157", "dtype": "float32"}, {"name": "1158", "dtype": "float32"}, {"name": "1159", "dtype": "float32"}, {"name": "1160", "dtype": "float32"}, {"name": "1161", "dtype": "float32"}, {"name": "1162", "dtype": "float32"}, {"name": "1163", "dtype": "float32"}, {"name": "1164", "dtype": "float32"}, {"name": "1165", "dtype": "float32"}, {"name": "1166", "dtype": "float32"}, {"name": "1167", "dtype": "float32"}, {"name": "1168", "dtype": "float32"}, {"name": "1169", "dtype": "float32"}, {"name": "1170", "dtype": "float32"}, {"name": "1171", "dtype": "float32"}, {"name": "1172", "dtype": "float32"}, {"name": "1173", "dtype": "float32"}, {"name": "1174", "dtype": "float32"}, {"name": "1175", "dtype": "float32"}, {"name": "1176", "dtype": "float32"}, {"name": "1177", "dtype": "float32"}, {"name": "1178", "dtype": "float32"}, {"name": "1179", "dtype": "float32"}, {"name": "1180", "dtype": "float32"}, {"name": "1181", "dtype": "float32"}, {"name": "1182", "dtype": "float32"}, {"name": "1183", "dtype": "float32"}, {"name": "1184", "dtype": "float32"}, {"name": "1185", "dtype": "float32"}, {"name": "1186", "dtype": "float32"}, {"name": "1187", "dtype": "float32"}, {"name": "1188", "dtype": "float32"}, {"name": "1189", "dtype": "float32"}, {"name": "1190", "dtype": "float32"}, {"name": "1191", "dtype": "float32"}, {"name": "1192", "dtype": "float32"}, {"name": "1193", "dtype": "float32"}, {"name": "1194", "dtype": "float32"}, {"name": "1195", "dtype": "float32"}, {"name": "1196", "dtype": "float32"}, {"name": "1197", "dtype": "float32"}, {"name": "1198", "dtype": "float32"}, {"name": "1199", "dtype": "float32"}, {"name": "1200", "dtype": "float32"}, {"name": "1201", "dtype": "float32"}, {"name": "1202", "dtype": "float32"}, {"name": "1203", "dtype": "float32"}, {"name": "1204", "dtype": "float32"}, {"name": "1205", "dtype": "float32"}, {"name": "1206", "dtype": "float32"}, {"name": "1207", "dtype": "float32"}, {"name": "1208", "dtype": "float32"}, {"name": "1209", "dtype": "float32"}, {"name": "1210", "dtype": "float32"}, {"name": "1211", "dtype": "float32"}, {"name": "1212", "dtype": "float32"}, {"name": "1213", "dtype": "float32"}, {"name": "1214", "dtype": "float32"}, {"name": "1215", "dtype": "float32"}, {"name": "1216", "dtype": "float32"}, {"name": "1217", "dtype": "float32"}, {"name": "1218", "dtype": "float32"}, {"name": "1219", "dtype": "float32"}, {"name": "1220", "dtype": "float32"}, {"name": "1221", "dtype": "float32"}, {"name": "1222", "dtype": "float32"}, {"name": "1223", "dtype": "float32"}, {"name": "1224", "dtype": "float32"}, {"name": "1225", "dtype": "float32"}, {"name": "1226", "dtype": "float32"}, {"name": "1227", "dtype": "float32"}, {"name": "1228", "dtype": "float32"}, {"name": "1229", "dtype": "float32"}, {"name": "1230", "dtype": "float32"}, {"name": "1231", "dtype": "float32"}, {"name": "1232", "dtype": "float32"}, {"name": "1233", "dtype": "float32"}, {"name": "1234", "dtype": "float32"}, {"name": "1235", "dtype": "float32"}, {"name": "1236", "dtype": "float32"}, {"name": "1237", "dtype": "float32"}, {"name": "1238", "dtype": "float32"}, {"name": "1239", "dtype": 
"float32"}, {"name": "1240", "dtype": "float32"}, {"name": "1241", "dtype": "float32"}, {"name": "1242", "dtype": "float32"}, {"name": "1243", "dtype": "float32"}, {"name": "1244", "dtype": "float32"}, {"name": "1245", "dtype": "float32"}, {"name": "1246", "dtype": "float32"}, {"name": "1247", "dtype": "float32"}, {"name": "1248", "dtype": "float32"}, {"name": "1249", "dtype": "float32"}, {"name": "1250", "dtype": "float32"}, {"name": "1251", "dtype": "float32"}, {"name": "1252", "dtype": "float32"}, {"name": "1253", "dtype": "float32"}, {"name": "1254", "dtype": "float32"}, {"name": "1255", "dtype": "float32"}, {"name": "1256", "dtype": "float32"}, {"name": "1257", "dtype": "float32"}, {"name": "1258", "dtype": "float32"}, {"name": "1259", "dtype": "float32"}, {"name": "1260", "dtype": "float32"}, {"name": "1261", "dtype": "float32"}, {"name": "1262", "dtype": "float32"}, {"name": "1263", "dtype": "float32"}, {"name": "1264", "dtype": "float32"}, {"name": "1265", "dtype": "float32"}, {"name": "1266", "dtype": "float32"}, {"name": "1267", "dtype": "float32"}, {"name": "1268", "dtype": "float32"}, {"name": "1269", "dtype": "float32"}, {"name": "1270", "dtype": "float32"}, {"name": "1271", "dtype": "float32"}, {"name": "1272", "dtype": "float32"}, {"name": "1273", "dtype": "float32"}, {"name": "1274", "dtype": "float32"}, {"name": "1275", "dtype": "float32"}, {"name": "1276", "dtype": "float32"}, {"name": "1277", "dtype": "float32"}, {"name": "1278", "dtype": "float32"}, {"name": "1279", "dtype": "float32"}, {"name": "1280", "dtype": "float32"}, {"name": "1281", "dtype": "float32"}, {"name": "1282", "dtype": "float32"}, {"name": "1283", "dtype": "float32"}, {"name": "1284", "dtype": "float32"}, {"name": "1285", "dtype": "float32"}, {"name": "1286", "dtype": "float32"}, {"name": "1287", "dtype": "float32"}, {"name": "1288", "dtype": "float32"}, {"name": "1289", "dtype": "float32"}, {"name": "1290", "dtype": "float32"}, {"name": "1291", "dtype": "float32"}, {"name": "1292", "dtype": "float32"}, {"name": "1293", "dtype": "float32"}, {"name": "1294", "dtype": "float32"}, {"name": "1295", "dtype": "float32"}, {"name": "1296", "dtype": "float32"}, {"name": "1297", "dtype": "float32"}, {"name": "1298", "dtype": "float32"}, {"name": "1299", "dtype": "float32"}, {"name": "1300", "dtype": "float32"}, {"name": "1301", "dtype": "float32"}, {"name": "1302", "dtype": "float32"}, {"name": "1303", "dtype": "float32"}, {"name": "1304", "dtype": "float32"}, {"name": "1305", "dtype": "float32"}, {"name": "1306", "dtype": "float32"}, {"name": "1307", "dtype": "float32"}, {"name": "1308", "dtype": "float32"}, {"name": "1309", "dtype": "float32"}, {"name": "1310", "dtype": "float32"}, {"name": "1311", "dtype": "float32"}, {"name": "1312", "dtype": "float32"}, {"name": "1313", "dtype": "float32"}, {"name": "1314", "dtype": "float32"}, {"name": "1315", "dtype": "float32"}, {"name": "1316", "dtype": "float32"}, {"name": "1317", "dtype": "float32"}, {"name": "1318", "dtype": "float32"}, {"name": "1319", "dtype": "float32"}, {"name": "1320", "dtype": "float32"}, {"name": "1321", "dtype": "float32"}, {"name": "1322", "dtype": "float32"}, {"name": "1323", "dtype": "float32"}, {"name": "1324", "dtype": "float32"}, {"name": "1325", "dtype": "float32"}, {"name": "1326", "dtype": "float32"}, {"name": "1327", "dtype": "float32"}, {"name": "1328", "dtype": "float32"}, {"name": "1329", "dtype": "float32"}, {"name": "1330", "dtype": "float32"}, {"name": "1331", "dtype": "float32"}, {"name": "1332", "dtype": "float32"}, {"name": 
"1333", "dtype": "float32"}, {"name": "1334", "dtype": "float32"}, {"name": "1335", "dtype": "float32"}, {"name": "1336", "dtype": "float32"}, {"name": "1337", "dtype": "float32"}, {"name": "1338", "dtype": "float32"}, {"name": "1339", "dtype": "float32"}, {"name": "1340", "dtype": "float32"}, {"name": "1341", "dtype": "float32"}, {"name": "1342", "dtype": "float32"}, {"name": "1343", "dtype": "float32"}, {"name": "1344", "dtype": "float32"}, {"name": "1345", "dtype": "float32"}, {"name": "1346", "dtype": "float32"}, {"name": "1347", "dtype": "float32"}, {"name": "1348", "dtype": "float32"}, {"name": "1349", "dtype": "float32"}, {"name": "1350", "dtype": "float32"}, {"name": "1351", "dtype": "float32"}, {"name": "1352", "dtype": "float32"}, {"name": "1353", "dtype": "float32"}, {"name": "1354", "dtype": "float32"}, {"name": "1355", "dtype": "float32"}, {"name": "1356", "dtype": "float32"}, {"name": "1357", "dtype": "float32"}, {"name": "1358", "dtype": "float32"}, {"name": "1359", "dtype": "float32"}, {"name": "1360", "dtype": "float32"}, {"name": "1361", "dtype": "float32"}, {"name": "1362", "dtype": "float32"}, {"name": "1363", "dtype": "float32"}, {"name": "1364", "dtype": "float32"}, {"name": "1365", "dtype": "float32"}, {"name": "1366", "dtype": "float32"}, {"name": "1367", "dtype": "float32"}, {"name": "1368", "dtype": "float32"}, {"name": "1369", "dtype": "float32"}, {"name": "1370", "dtype": "float32"}, {"name": "1371", "dtype": "float32"}, {"name": "1372", "dtype": "float32"}, {"name": "1373", "dtype": "float32"}, {"name": "1374", "dtype": "float32"}, {"name": "1375", "dtype": "float32"}, {"name": "1376", "dtype": "float32"}, {"name": "1377", "dtype": "float32"}, {"name": "1378", "dtype": "float32"}, {"name": "1379", "dtype": "float32"}, {"name": "1380", "dtype": "float32"}, {"name": "1381", "dtype": "float32"}, {"name": "1382", "dtype": "float32"}, {"name": "1383", "dtype": "float32"}, {"name": "1384", "dtype": "float32"}, {"name": "1385", "dtype": "float32"}, {"name": "1386", "dtype": "float32"}, {"name": "1387", "dtype": "float32"}, {"name": "1388", "dtype": "float32"}, {"name": "1389", "dtype": "float32"}, {"name": "1390", "dtype": "float32"}, {"name": "1391", "dtype": "float32"}, {"name": "1392", "dtype": "float32"}, {"name": "1393", "dtype": "float32"}, {"name": "1394", "dtype": "float32"}, {"name": "1395", "dtype": "float32"}, {"name": "1396", "dtype": "float32"}, {"name": "1397", "dtype": "float32"}, {"name": "1398", "dtype": "float32"}, {"name": "1399", "dtype": "float32"}, {"name": "1400", "dtype": "float32"}, {"name": "1401", "dtype": "float32"}, {"name": "1402", "dtype": "float32"}, {"name": "1403", "dtype": "float32"}, {"name": "1404", "dtype": "float32"}, {"name": "1405", "dtype": "float32"}, {"name": "1406", "dtype": "float32"}, {"name": "1407", "dtype": "float32"}, {"name": "1408", "dtype": "float32"}, {"name": "1409", "dtype": "float32"}, {"name": "1410", "dtype": "float32"}, {"name": "1411", "dtype": "float32"}, {"name": "1412", "dtype": "float32"}, {"name": "1413", "dtype": "float32"}, {"name": "1414", "dtype": "float32"}, {"name": "1415", "dtype": "float32"}, {"name": "1416", "dtype": "float32"}, {"name": "1417", "dtype": "float32"}, {"name": "1418", "dtype": "float32"}, {"name": "1419", "dtype": "float32"}, {"name": "1420", "dtype": "float32"}, {"name": "1421", "dtype": "float32"}, {"name": "1422", "dtype": "float32"}, {"name": "1423", "dtype": "float32"}, {"name": "1424", "dtype": "float32"}, {"name": "1425", "dtype": "float32"}, {"name": "1426", "dtype": 
"float32"}, {"name": "1427", "dtype": "float32"}, {"name": "1428", "dtype": "float32"}, {"name": "1429", "dtype": "float32"}, {"name": "1430", "dtype": "float32"}, {"name": "1431", "dtype": "float32"}, {"name": "1432", "dtype": "float32"}, {"name": "1433", "dtype": "float32"}, {"name": "1434", "dtype": "float32"}, {"name": "1435", "dtype": "float32"}, {"name": "1436", "dtype": "float32"}, {"name": "1437", "dtype": "float32"}, {"name": "1438", "dtype": "float32"}, {"name": "1439", "dtype": "float32"}, {"name": "1440", "dtype": "float32"}, {"name": "1441", "dtype": "float32"}, {"name": "1442", "dtype": "float32"}, {"name": "1443", "dtype": "float32"}, {"name": "1444", "dtype": "float32"}, {"name": "1445", "dtype": "float32"}, {"name": "1446", "dtype": "float32"}, {"name": "1447", "dtype": "float32"}, {"name": "1448", "dtype": "float32"}, {"name": "1449", "dtype": "float32"}, {"name": "1450", "dtype": "float32"}, {"name": "1451", "dtype": "float32"}, {"name": "1452", "dtype": "float32"}, {"name": "1453", "dtype": "float32"}, {"name": "1454", "dtype": "float32"}, {"name": "1455", "dtype": "float32"}, {"name": "1456", "dtype": "float32"}, {"name": "1457", "dtype": "float32"}, {"name": "1458", "dtype": "float32"}, {"name": "1459", "dtype": "float32"}, {"name": "1460", "dtype": "float32"}, {"name": "1461", "dtype": "float32"}, {"name": "1462", "dtype": "float32"}, {"name": "1463", "dtype": "float32"}, {"name": "1464", "dtype": "float32"}, {"name": "1465", "dtype": "float32"}, {"name": "1466", "dtype": "float32"}, {"name": "1467", "dtype": "float32"}, {"name": "1468", "dtype": "float32"}, {"name": "1469", "dtype": "float32"}, {"name": "1470", "dtype": "float32"}, {"name": "1471", "dtype": "float32"}, {"name": "1472", "dtype": "float32"}, {"name": "1473", "dtype": "float32"}, {"name": "1474", "dtype": "float32"}, {"name": "1475", "dtype": "float32"}, {"name": "1476", "dtype": "float32"}, {"name": "1477", "dtype": "float32"}, {"name": "1478", "dtype": "float32"}, {"name": "1479", "dtype": "float32"}, {"name": "1480", "dtype": "float32"}, {"name": "1481", "dtype": "float32"}, {"name": "1482", "dtype": "float32"}, {"name": "1483", "dtype": "float32"}, {"name": "1484", "dtype": "float32"}, {"name": "1485", "dtype": "float32"}, {"name": "1486", "dtype": "float32"}, {"name": "1487", "dtype": "float32"}, {"name": "1488", "dtype": "float32"}, {"name": "1489", "dtype": "float32"}, {"name": "1490", "dtype": "float32"}, {"name": "1491", "dtype": "float32"}, {"name": "1492", "dtype": "float32"}, {"name": "1493", "dtype": "float32"}, {"name": "1494", "dtype": "float32"}, {"name": "1495", "dtype": "float32"}, {"name": "1496", "dtype": "float32"}, {"name": "1497", "dtype": "float32"}, {"name": "1498", "dtype": "float32"}, {"name": "1499", "dtype": "float32"}, {"name": "1500", "dtype": "float32"}, {"name": "1501", "dtype": "float32"}, {"name": "1502", "dtype": "float32"}, {"name": "1503", "dtype": "float32"}, {"name": "1504", "dtype": "float32"}, {"name": "1505", "dtype": "float32"}, {"name": "1506", "dtype": "float32"}, {"name": "1507", "dtype": "float32"}, {"name": "1508", "dtype": "float32"}, {"name": "1509", "dtype": "float32"}, {"name": "1510", "dtype": "float32"}, {"name": "1511", "dtype": "float32"}, {"name": "1512", "dtype": "float32"}, {"name": "1513", "dtype": "float32"}, {"name": "1514", "dtype": "float32"}, {"name": "1515", "dtype": "float32"}, {"name": "1516", "dtype": "float32"}, {"name": "1517", "dtype": "float32"}, {"name": "1518", "dtype": "float32"}, {"name": "1519", "dtype": "float32"}, {"name": 
"1520", "dtype": "float32"}, {"name": "1521", "dtype": "float32"}, {"name": "1522", "dtype": "float32"}, {"name": "1523", "dtype": "float32"}, {"name": "1524", "dtype": "float32"}, {"name": "1525", "dtype": "float32"}, {"name": "1526", "dtype": "float32"}, {"name": "1527", "dtype": "float32"}, {"name": "1528", "dtype": "float32"}, {"name": "1529", "dtype": "float32"}, {"name": "1530", "dtype": "float32"}, {"name": "1531", "dtype": "float32"}, {"name": "1532", "dtype": "float32"}, {"name": "1533", "dtype": "float32"}, {"name": "1534", "dtype": "float32"}, {"name": "1535", "dtype": "float32"}, {"name": "1536", "dtype": "float32"}, {"name": "1537", "dtype": "float32"}, {"name": "1538", "dtype": "float32"}, {"name": "1539", "dtype": "float32"}, {"name": "1540", "dtype": "float32"}, {"name": "1541", "dtype": "float32"}, {"name": "1542", "dtype": "float32"}, {"name": "1543", "dtype": "float32"}, {"name": "1544", "dtype": "float32"}, {"name": "1545", "dtype": "float32"}, {"name": "1546", "dtype": "float32"}, {"name": "1547", "dtype": "float32"}, {"name": "1548", "dtype": "float32"}, {"name": "1549", "dtype": "float32"}, {"name": "1550", "dtype": "float32"}, {"name": "1551", "dtype": "float32"}, {"name": "1552", "dtype": "float32"}, {"name": "1553", "dtype": "float32"}, {"name": "1554", "dtype": "float32"}, {"name": "1555", "dtype": "float32"}, {"name": "1556", "dtype": "float32"}, {"name": "1557", "dtype": "float32"}, {"name": "1558", "dtype": "float32"}, {"name": "1559", "dtype": "float32"}, {"name": "1560", "dtype": "float32"}, {"name": "1561", "dtype": "float32"}, {"name": "1562", "dtype": "float32"}, {"name": "1563", "dtype": "float32"}, {"name": "1564", "dtype": "float32"}, {"name": "1565", "dtype": "float32"}, {"name": "1566", "dtype": "float32"}, {"name": "1567", "dtype": "float32"}, {"name": "1568", "dtype": "float32"}, {"name": "1569", "dtype": "float32"}, {"name": "1570", "dtype": "float32"}, {"name": "1571", "dtype": "float32"}, {"name": "1572", "dtype": "float32"}, {"name": "1573", "dtype": "float32"}, {"name": "1574", "dtype": "float32"}, {"name": "1575", "dtype": "float32"}, {"name": "1576", "dtype": "float32"}, {"name": "1577", "dtype": "float32"}, {"name": "1578", "dtype": "float32"}, {"name": "1579", "dtype": "float32"}, {"name": "1580", "dtype": "float32"}, {"name": "1581", "dtype": "float32"}, {"name": "1582", "dtype": "float32"}, {"name": "1583", "dtype": "float32"}, {"name": "1584", "dtype": "float32"}, {"name": "1585", "dtype": "float32"}, {"name": "1586", "dtype": "float32"}, {"name": "1587", "dtype": "float32"}, {"name": "1588", "dtype": "float32"}, {"name": "1589", "dtype": "float32"}, {"name": "1590", "dtype": "float32"}, {"name": "1591", "dtype": "float32"}, {"name": "1592", "dtype": "float32"}, {"name": "1593", "dtype": "float32"}, {"name": "1594", "dtype": "float32"}, {"name": "1595", "dtype": "float32"}, {"name": "1596", "dtype": "float32"}, {"name": "1597", "dtype": "float32"}, {"name": "1598", "dtype": "float32"}, {"name": "1599", "dtype": "float32"}, {"name": "1600", "dtype": "float32"}, {"name": "1601", "dtype": "float32"}, {"name": "1602", "dtype": "float32"}, {"name": "1603", "dtype": "float32"}, {"name": "1604", "dtype": "float32"}, {"name": "1605", "dtype": "float32"}, {"name": "1606", "dtype": "float32"}, {"name": "1607", "dtype": "float32"}, {"name": "1608", "dtype": "float32"}, {"name": "1609", "dtype": "float32"}, {"name": "1610", "dtype": "float32"}, {"name": "1611", "dtype": "float32"}, {"name": "1612", "dtype": "float32"}, {"name": "1613", "dtype": 
"float32"}, {"name": "1614", "dtype": "float32"}, {"name": "1615", "dtype": "float32"}, {"name": "1616", "dtype": "float32"}, {"name": "1617", "dtype": "float32"}, {"name": "1618", "dtype": "float32"}, {"name": "1619", "dtype": "float32"}, {"name": "1620", "dtype": "float32"}, {"name": "1621", "dtype": "float32"}, {"name": "1622", "dtype": "float32"}, {"name": "1623", "dtype": "float32"}, {"name": "1624", "dtype": "float32"}, {"name": "1625", "dtype": "float32"}, {"name": "1626", "dtype": "float32"}, {"name": "1627", "dtype": "float32"}, {"name": "1628", "dtype": "float32"}, {"name": "1629", "dtype": "float32"}, {"name": "1630", "dtype": "float32"}, {"name": "1631", "dtype": "float32"}, {"name": "1632", "dtype": "float32"}, {"name": "1633", "dtype": "float32"}, {"name": "1634", "dtype": "float32"}, {"name": "1635", "dtype": "float32"}, {"name": "1636", "dtype": "float32"}, {"name": "1637", "dtype": "float32"}, {"name": "1638", "dtype": "float32"}, {"name": "1639", "dtype": "float32"}, {"name": "1640", "dtype": "float32"}, {"name": "1641", "dtype": "float32"}, {"name": "1642", "dtype": "float32"}, {"name": "1643", "dtype": "float32"}, {"name": "1644", "dtype": "float32"}, {"name": "1645", "dtype": "float32"}, {"name": "1646", "dtype": "float32"}, {"name": "1647", "dtype": "float32"}, {"name": "1648", "dtype": "float32"}, {"name": "1649", "dtype": "float32"}, {"name": "1650", "dtype": "float32"}, {"name": "1651", "dtype": "float32"}, {"name": "1652", "dtype": "float32"}, {"name": "1653", "dtype": "float32"}, {"name": "1654", "dtype": "float32"}, {"name": "1655", "dtype": "float32"}, {"name": "1656", "dtype": "float32"}, {"name": "1657", "dtype": "float32"}, {"name": "1658", "dtype": "float32"}, {"name": "1659", "dtype": "float32"}, {"name": "1660", "dtype": "float32"}, {"name": "1661", "dtype": "float32"}, {"name": "1662", "dtype": "float32"}, {"name": "1663", "dtype": "float32"}, {"name": "1664", "dtype": "float32"}, {"name": "1665", "dtype": "float32"}, {"name": "1666", "dtype": "float32"}, {"name": "1667", "dtype": "float32"}, {"name": "1668", "dtype": "float32"}, {"name": "1669", "dtype": "float32"}, {"name": "1670", "dtype": "float32"}, {"name": "1671", "dtype": "float32"}, {"name": "1672", "dtype": "float32"}, {"name": "1673", "dtype": "float32"}, {"name": "1674", "dtype": "float32"}, {"name": "1675", "dtype": "float32"}, {"name": "1676", "dtype": "float32"}, {"name": "1677", "dtype": "float32"}, {"name": "1678", "dtype": "float32"}, {"name": "1679", "dtype": "float32"}, {"name": "1680", "dtype": "float32"}, {"name": "1681", "dtype": "float32"}, {"name": "1682", "dtype": "float32"}, {"name": "1683", "dtype": "float32"}, {"name": "1684", "dtype": "float32"}, {"name": "1685", "dtype": "float32"}, {"name": "1686", "dtype": "float32"}, {"name": "1687", "dtype": "float32"}, {"name": "1688", "dtype": "float32"}, {"name": "1689", "dtype": "float32"}, {"name": "1690", "dtype": "float32"}, {"name": "1691", "dtype": "float32"}, {"name": "1692", "dtype": "float32"}, {"name": "1693", "dtype": "float32"}, {"name": "1694", "dtype": "float32"}, {"name": "1695", "dtype": "float32"}, {"name": "1696", "dtype": "float32"}, {"name": "1697", "dtype": "float32"}, {"name": "1698", "dtype": "float32"}, {"name": "1699", "dtype": "float32"}, {"name": "1700", "dtype": "float32"}, {"name": "1701", "dtype": "float32"}, {"name": "1702", "dtype": "float32"}, {"name": "1703", "dtype": "float32"}, {"name": "1704", "dtype": "float32"}, {"name": "1705", "dtype": "float32"}, {"name": "1706", "dtype": "float32"}, {"name": 
"1707", "dtype": "float32"}, {"name": "1708", "dtype": "float32"}, {"name": "1709", "dtype": "float32"}, {"name": "1710", "dtype": "float32"}, {"name": "1711", "dtype": "float32"}, {"name": "1712", "dtype": "float32"}, {"name": "1713", "dtype": "float32"}, {"name": "1714", "dtype": "float32"}, {"name": "1715", "dtype": "float32"}, {"name": "1716", "dtype": "float32"}, {"name": "1717", "dtype": "float32"}, {"name": "1718", "dtype": "float32"}, {"name": "1719", "dtype": "float32"}, {"name": "1720", "dtype": "float32"}, {"name": "1721", "dtype": "float32"}, {"name": "1722", "dtype": "float32"}, {"name": "1723", "dtype": "float32"}, {"name": "1724", "dtype": "float32"}, {"name": "1725", "dtype": "float32"}, {"name": "1726", "dtype": "float32"}, {"name": "1727", "dtype": "float32"}, {"name": "1728", "dtype": "float32"}, {"name": "1729", "dtype": "float32"}, {"name": "1730", "dtype": "float32"}, {"name": "1731", "dtype": "float32"}, {"name": "1732", "dtype": "float32"}, {"name": "1733", "dtype": "float32"}, {"name": "1734", "dtype": "float32"}, {"name": "1735", "dtype": "float32"}, {"name": "1736", "dtype": "float32"}, {"name": "1737", "dtype": "float32"}, {"name": "1738", "dtype": "float32"}, {"name": "1739", "dtype": "float32"}, {"name": "1740", "dtype": "float32"}, {"name": "1741", "dtype": "float32"}, {"name": "1742", "dtype": "float32"}, {"name": "1743", "dtype": "float32"}, {"name": "1744", "dtype": "float32"}, {"name": "1745", "dtype": "float32"}, {"name": "1746", "dtype": "float32"}, {"name": "1747", "dtype": "float32"}, {"name": "1748", "dtype": "float32"}, {"name": "1749", "dtype": "float32"}, {"name": "1750", "dtype": "float32"}, {"name": "1751", "dtype": "float32"}, {"name": "1752", "dtype": "float32"}, {"name": "1753", "dtype": "float32"}, {"name": "1754", "dtype": "float32"}, {"name": "1755", "dtype": "float32"}, {"name": "1756", "dtype": "float32"}, {"name": "1757", "dtype": "float32"}, {"name": "1758", "dtype": "float32"}, {"name": "1759", "dtype": "float32"}, {"name": "1760", "dtype": "float32"}, {"name": "1761", "dtype": "float32"}, {"name": "1762", "dtype": "float32"}, {"name": "1763", "dtype": "float32"}, {"name": "1764", "dtype": "float32"}, {"name": "1765", "dtype": "float32"}, {"name": "1766", "dtype": "float32"}, {"name": "1767", "dtype": "float32"}, {"name": "1768", "dtype": "float32"}, {"name": "1769", "dtype": "float32"}, {"name": "1770", "dtype": "float32"}, {"name": "1771", "dtype": "float32"}, {"name": "1772", "dtype": "float32"}, {"name": "1773", "dtype": "float32"}, {"name": "1774", "dtype": "float32"}, {"name": "1775", "dtype": "float32"}, {"name": "1776", "dtype": "float32"}, {"name": "1777", "dtype": "float32"}, {"name": "1778", "dtype": "float32"}, {"name": "1779", "dtype": "float32"}, {"name": "1780", "dtype": "float32"}, {"name": "1781", "dtype": "float32"}, {"name": "1782", "dtype": "float32"}, {"name": "1783", "dtype": "float32"}, {"name": "1784", "dtype": "float32"}, {"name": "1785", "dtype": "float32"}, {"name": "1786", "dtype": "float32"}, {"name": "1787", "dtype": "float32"}, {"name": "1788", "dtype": "float32"}, {"name": "1789", "dtype": "float32"}, {"name": "1790", "dtype": "float32"}, {"name": "1791", "dtype": "float32"}, {"name": "1792", "dtype": "float32"}, {"name": "1793", "dtype": "float32"}, {"name": "1794", "dtype": "float32"}, {"name": "1795", "dtype": "float32"}, {"name": "1796", "dtype": "float32"}, {"name": "1797", "dtype": "float32"}, {"name": "1798", "dtype": "float32"}, {"name": "1799", "dtype": "float32"}, {"name": "1800", "dtype": 
"float32"}, {"name": "1801", "dtype": "float32"}, {"name": "1802", "dtype": "float32"}, {"name": "1803", "dtype": "float32"}, {"name": "1804", "dtype": "float32"}, {"name": "1805", "dtype": "float32"}, {"name": "1806", "dtype": "float32"}, {"name": "1807", "dtype": "float32"}, {"name": "1808", "dtype": "float32"}, {"name": "1809", "dtype": "float32"}, {"name": "1810", "dtype": "float32"}, {"name": "1811", "dtype": "float32"}, {"name": "1812", "dtype": "float32"}, {"name": "1813", "dtype": "float32"}, {"name": "1814", "dtype": "float32"}, {"name": "1815", "dtype": "float32"}, {"name": "1816", "dtype": "float32"}, {"name": "1817", "dtype": "float32"}, {"name": "1818", "dtype": "float32"}, {"name": "1819", "dtype": "float32"}, {"name": "1820", "dtype": "float32"}, {"name": "1821", "dtype": "float32"}, {"name": "1822", "dtype": "float32"}, {"name": "1823", "dtype": "float32"}, {"name": "1824", "dtype": "float32"}, {"name": "1825", "dtype": "float32"}, {"name": "1826", "dtype": "float32"}, {"name": "1827", "dtype": "float32"}, {"name": "1828", "dtype": "float32"}, {"name": "1829", "dtype": "float32"}, {"name": "1830", "dtype": "float32"}, {"name": "1831", "dtype": "float32"}, {"name": "1832", "dtype": "float32"}, {"name": "1833", "dtype": "float32"}, {"name": "1834", "dtype": "float32"}, {"name": "1835", "dtype": "float32"}, {"name": "1836", "dtype": "float32"}, {"name": "1837", "dtype": "float32"}, {"name": "1838", "dtype": "float32"}, {"name": "1839", "dtype": "float32"}, {"name": "1840", "dtype": "float32"}, {"name": "1841", "dtype": "float32"}, {"name": "1842", "dtype": "float32"}, {"name": "1843", "dtype": "float32"}, {"name": "1844", "dtype": "float32"}, {"name": "1845", "dtype": "float32"}, {"name": "1846", "dtype": "float32"}, {"name": "1847", "dtype": "float32"}, {"name": "1848", "dtype": "float32"}, {"name": "1849", "dtype": "float32"}, {"name": "1850", "dtype": "float32"}, {"name": "1851", "dtype": "float32"}, {"name": "1852", "dtype": "float32"}, {"name": "1853", "dtype": "float32"}, {"name": "1854", "dtype": "float32"}, {"name": "1855", "dtype": "float32"}, {"name": "1856", "dtype": "float32"}, {"name": "1857", "dtype": "float32"}, {"name": "1858", "dtype": "float32"}, {"name": "1859", "dtype": "float32"}, {"name": "1860", "dtype": "float32"}, {"name": "1861", "dtype": "float32"}, {"name": "1862", "dtype": "float32"}, {"name": "1863", "dtype": "float32"}, {"name": "1864", "dtype": "float32"}, {"name": "1865", "dtype": "float32"}, {"name": "1866", "dtype": "float32"}, {"name": "1867", "dtype": "float32"}, {"name": "1868", "dtype": "float32"}, {"name": "1869", "dtype": "float32"}, {"name": "1870", "dtype": "float32"}, {"name": "1871", "dtype": "float32"}, {"name": "1872", "dtype": "float32"}, {"name": "1873", "dtype": "float32"}, {"name": "1874", "dtype": "float32"}, {"name": "1875", "dtype": "float32"}, {"name": "1876", "dtype": "float32"}, {"name": "1877", "dtype": "float32"}, {"name": "1878", "dtype": "float32"}, {"name": "1879", "dtype": "float32"}, {"name": "1880", "dtype": "float32"}, {"name": "1881", "dtype": "float32"}, {"name": "1882", "dtype": "float32"}, {"name": "1883", "dtype": "float32"}, {"name": "1884", "dtype": "float32"}, {"name": "1885", "dtype": "float32"}, {"name": "1886", "dtype": "float32"}, {"name": "1887", "dtype": "float32"}, {"name": "1888", "dtype": "float32"}, {"name": "1889", "dtype": "float32"}, {"name": "1890", "dtype": "float32"}, {"name": "1891", "dtype": "float32"}, {"name": "1892", "dtype": "float32"}, {"name": "1893", "dtype": "float32"}, {"name": 
"1894", "dtype": "float32"}, {"name": "1895", "dtype": "float32"}, {"name": "1896", "dtype": "float32"}, {"name": "1897", "dtype": "float32"}, {"name": "1898", "dtype": "float32"}, {"name": "1899", "dtype": "float32"}, {"name": "1900", "dtype": "float32"}, {"name": "1901", "dtype": "float32"}, {"name": "1902", "dtype": "float32"}, {"name": "1903", "dtype": "float32"}, {"name": "1904", "dtype": "float32"}, {"name": "1905", "dtype": "float32"}, {"name": "1906", "dtype": "float32"}, {"name": "1907", "dtype": "float32"}, {"name": "1908", "dtype": "float32"}, {"name": "1909", "dtype": "float32"}, {"name": "1910", "dtype": "float32"}, {"name": "1911", "dtype": "float32"}, {"name": "1912", "dtype": "float32"}, {"name": "1913", "dtype": "float32"}, {"name": "1914", "dtype": "float32"}, {"name": "1915", "dtype": "float32"}, {"name": "1916", "dtype": "float32"}, {"name": "1917", "dtype": "float32"}, {"name": "1918", "dtype": "float32"}, {"name": "1919", "dtype": "float32"}, {"name": "1920", "dtype": "float32"}, {"name": "1921", "dtype": "float32"}, {"name": "1922", "dtype": "float32"}, {"name": "1923", "dtype": "float32"}, {"name": "1924", "dtype": "float32"}, {"name": "1925", "dtype": "float32"}, {"name": "1926", "dtype": "float32"}, {"name": "1927", "dtype": "float32"}, {"name": "1928", "dtype": "float32"}, {"name": "1929", "dtype": "float32"}, {"name": "1930", "dtype": "float32"}, {"name": "1931", "dtype": "float32"}, {"name": "1932", "dtype": "float32"}, {"name": "1933", "dtype": "float32"}, {"name": "1934", "dtype": "float32"}, {"name": "1935", "dtype": "float32"}, {"name": "1936", "dtype": "float32"}, {"name": "1937", "dtype": "float32"}, {"name": "1938", "dtype": "float32"}, {"name": "1939", "dtype": "float32"}, {"name": "1940", "dtype": "float32"}, {"name": "1941", "dtype": "float32"}, {"name": "1942", "dtype": "float32"}, {"name": "1943", "dtype": "float32"}, {"name": "1944", "dtype": "float32"}, {"name": "1945", "dtype": "float32"}, {"name": "1946", "dtype": "float32"}, {"name": "1947", "dtype": "float32"}, {"name": "1948", "dtype": "float32"}, {"name": "1949", "dtype": "float32"}, {"name": "1950", "dtype": "float32"}, {"name": "1951", "dtype": "float32"}, {"name": "1952", "dtype": "float32"}, {"name": "1953", "dtype": "float32"}, {"name": "1954", "dtype": "float32"}, {"name": "1955", "dtype": "float32"}, {"name": "1956", "dtype": "float32"}, {"name": "1957", "dtype": "float32"}, {"name": "1958", "dtype": "float32"}, {"name": "1959", "dtype": "float32"}, {"name": "1960", "dtype": "float32"}, {"name": "1961", "dtype": "float32"}, {"name": "1962", "dtype": "float32"}, {"name": "1963", "dtype": "float32"}, {"name": "1964", "dtype": "float32"}, {"name": "1965", "dtype": "float32"}, {"name": "1966", "dtype": "float32"}, {"name": "1967", "dtype": "float32"}, {"name": "1968", "dtype": "float32"}, {"name": "1969", "dtype": "float32"}, {"name": "1970", "dtype": "float32"}, {"name": "1971", "dtype": "float32"}, {"name": "1972", "dtype": "float32"}, {"name": "1973", "dtype": "float32"}, {"name": "1974", "dtype": "float32"}, {"name": "1975", "dtype": "float32"}, {"name": "1976", "dtype": "float32"}, {"name": "1977", "dtype": "float32"}, {"name": "1978", "dtype": "float32"}, {"name": "1979", "dtype": "float32"}, {"name": "1980", "dtype": "float32"}, {"name": "1981", "dtype": "float32"}, {"name": "1982", "dtype": "float32"}, {"name": "1983", "dtype": "float32"}, {"name": "1984", "dtype": "float32"}, {"name": "1985", "dtype": "float32"}, {"name": "1986", "dtype": "float32"}, {"name": "1987", "dtype": 
"float32"}, {"name": "1988", "dtype": "float32"}, {"name": "1989", "dtype": "float32"}, {"name": "1990", "dtype": "float32"}, {"name": "1991", "dtype": "float32"}, {"name": "1992", "dtype": "float32"}, {"name": "1993", "dtype": "float32"}, {"name": "1994", "dtype": "float32"}, {"name": "1995", "dtype": "float32"}, {"name": "1996", "dtype": "float32"}, {"name": "1997", "dtype": "float32"}, {"name": "1998", "dtype": "float32"}, {"name": "1999", "dtype": "float32"}, {"name": "2000", "dtype": "float32"}, {"name": "2001", "dtype": "float32"}, {"name": "2002", "dtype": "float32"}, {"name": "2003", "dtype": "float32"}, {"name": "2004", "dtype": "float32"}, {"name": "2005", "dtype": "float32"}, {"name": "2006", "dtype": "float32"}, {"name": "2007", "dtype": "float32"}, {"name": "2008", "dtype": "float32"}, {"name": "2009", "dtype": "float32"}, {"name": "2010", "dtype": "float32"}, {"name": "2011", "dtype": "float32"}, {"name": "2012", "dtype": "float32"}, {"name": "2013", "dtype": "float32"}, {"name": "2014", "dtype": "float32"}, {"name": "2015", "dtype": "float32"}, {"name": "2016", "dtype": "float32"}, {"name": "2017", "dtype": "float32"}, {"name": "2018", "dtype": "float32"}, {"name": "2019", "dtype": "float32"}, {"name": "2020", "dtype": "float32"}, {"name": "2021", "dtype": "float32"}, {"name": "2022", "dtype": "float32"}, {"name": "2023", "dtype": "float32"}, {"name": "2024", "dtype": "float32"}, {"name": "2025", "dtype": "float32"}, {"name": "2026", "dtype": "float32"}, {"name": "2027", "dtype": "float32"}, {"name": "2028", "dtype": "float32"}, {"name": "2029", "dtype": "float32"}, {"name": "2030", "dtype": "float32"}, {"name": "2031", "dtype": "float32"}, {"name": "2032", "dtype": "float32"}, {"name": "2033", "dtype": "float32"}, {"name": "2034", "dtype": "float32"}, {"name": "2035", "dtype": "float32"}, {"name": "2036", "dtype": "float32"}, {"name": "2037", "dtype": "float32"}, {"name": "2038", "dtype": "float32"}, {"name": "2039", "dtype": "float32"}, {"name": "2040", "dtype": "float32"}, {"name": "2041", "dtype": "float32"}, {"name": "2042", "dtype": "float32"}, {"name": "2043", "dtype": "float32"}, {"name": "2044", "dtype": "float32"}, {"name": "2045", "dtype": "float32"}, {"name": "2046", "dtype": "float32"}, {"name": "2047", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 307576729.6875, "num_examples": 37500}, {"name": "test", "num_bytes": 102525577.5, "num_examples": 12500}], "download_size": 565394599, "dataset_size": 410102307.1875}}
2023-08-23T04:02:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Thunderbird_GPTNEO_Finetuned" More Information needed
[ "# Dataset Card for \"Thunderbird_GPTNEO_Finetuned\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Thunderbird_GPTNEO_Finetuned\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Thunderbird_GPTNEO_Finetuned\"\n\nMore Information needed" ]
048b4f59e5c270babd085a9a11745e251745b1d4
# Dataset of gengetsu/幻月 (Touhou)

This is the dataset of gengetsu/幻月 (Touhou), containing 247 images and their tags.

The core tags of this character are `blonde_hair, bow, short_hair, wings, hair_bow, yellow_eyes, red_bow, angel_wings, feathered_wings, white_wings, ribbon`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                         | Type       | Description                                                           |
|:-----------------|-------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              | 247    | 236.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gengetsu_touhou/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              | 247    | 156.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gengetsu_touhou/resolve/main/dataset-800.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 475    | 296.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gengetsu_touhou/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 247    | 216.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gengetsu_touhou/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 475    | 381.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gengetsu_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/gengetsu_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined from these clusters.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:------|:------|:------|:------|:------|:-----|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, closed_mouth, collared_shirt, juliet_sleeves, looking_at_viewer, red_bowtie, simple_background, solo, white_shirt, bangs, buttons, open_vest, smile, red_vest, upper_body, brown_vest |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, juliet_sleeves, looking_at_viewer, open_mouth, red_bowtie, solo, white_shirt, :d, brown_vest, open_vest, pink_skirt, blush, medium_breasts |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, juliet_sleeves, red_bowtie, smile, solo, vest, dress, looking_at_viewer, open_mouth, shirt, long_skirt |
| 3 | 13 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, dress, solo, smile |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | collared_shirt | juliet_sleeves | looking_at_viewer | red_bowtie | simple_background | solo | white_shirt | bangs | buttons | open_vest | smile | red_vest | upper_body | brown_vest | open_mouth | :d | pink_skirt | blush | medium_breasts | vest | dress | shirt | long_skirt |
|----:|----------:|:------|:------|:------|:------|:------|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X |  |  | X | X | X |  | X | X |  |  | X |  |  |  | X | X | X | X | X | X |  |  |  |  |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X |  |  | X | X | X |  | X |  |  |  |  | X |  |  |  | X |  |  |  |  | X | X | X | X |
| 3 | 13 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X |  |  |  |  |  |  | X |  |  |  |  | X |  |  |  |  |  |  |  |  |  | X |  |  |
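The IMG+TXT packages in the List of Packages table above ship plain images beside tag text files instead of waifuc metadata. Below is a small consumption sketch for the 800px package; it assumes (the card does not spell this out) the usual convention that each image `foo.png` sits next to a `foo.txt` holding comma-separated tags.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/gengetsu_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
img_dir = 'gengetsu_800'
os.makedirs(img_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_dir)

# pair every image with its same-stem tag file (layout assumption noted above)
for root, _, files in os.walk(img_dir):
    for fname in sorted(files):
        stem, ext = os.path.splitext(fname)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(root, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                tags = [t.strip() for t in f.read().split(',')]
            print(fname, tags[:5])
```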
CyberHarem/gengetsu_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T19:49:46+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T02:32:01+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of gengetsu/幻月 (Touhou) =============================== This is the dataset of gengetsu/幻月 (Touhou), containing 247 images and their tags. The core tags of this character are 'blonde\_hair, bow, short\_hair, wings, hair\_bow, yellow\_eyes, red\_bow, angel\_wings, feathered\_wings, white\_wings, ribbon', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
b2cf4276d50911df71ea39613c772f18c5a63fe2
# Dataset of hata_tan/はたたん (Touhou)

This is the dataset of hata_tan/はたたん (Touhou), containing 90 images and their tags.

The core tags of this character are `twintails, long_hair, hat, tokin_hat, purple_eyes, black_hair, brown_hair`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                         | Type       | Description                                                           |
|:-----------------|-------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              | 90     | 68.20 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              | 90     | 54.78 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-800.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 181    | 102.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 90     | 66.16 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 181    | 118.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/hata_tan_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined from these clusters.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:------|:------|:------|:------|:------|:-----|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bags_under_eyes, necktie, solo, hair_ribbon |
| 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, necktie, solo, cellphone, checkered_skirt |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bags_under_eyes | necktie | solo | hair_ribbon | cellphone | checkered_skirt |
|----:|----------:|:------|:------|:------|:------|:------|:---|:---|:---|:---|:---|:---|:---|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X |  |  |
| 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X |  | X | X |  | X | X |
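The cluster tags above can drive a simple filter on top of the loading snippet in the card. A sketch follows; it assumes the raw package was already extracted to `dataset_dir` as shown, and that `item.meta['tags']` iterates over tag names (a set conversion covers both list- and dict-shaped tag storage).

```python
from waifuc.source import LocalSource

# keep only images carrying a tag that distinguishes cluster 1 above
wanted = {'cellphone', 'checkered_skirt'}

source = LocalSource('dataset_dir')
for item in source:
    tags = set(item.meta['tags'])  # dict keys or list entries both become a set
    hits = wanted & tags
    if hits:
        print(item.meta['filename'], sorted(hits))
```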
CyberHarem/hata_tan_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T20:05:14+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T04:37:38+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of hata\_tan/はたたん (Touhou) ================================== This is the dataset of hata\_tan/はたたん (Touhou), containing 90 images and their tags. The core tags of this character are 'twintails, long\_hair, hat, tokin\_hat, purple\_eyes, black\_hair, brown\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
194a4d82fa649efd26e7b69900d03658bb06faf1
# Dataset of kawashiro_mitori/河城みとり (Touhou)

This is the dataset of kawashiro_mitori/河城みとり (Touhou), containing 25 images and their tags.

The core tags of this character are `hair_ornament, hat, short_hair, red_eyes, pink_hair, side_ponytail`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size      | Download                                                                                                                 | Type       | Description                                                           |
|:-----------------|-------:|:----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              | 25     | 21.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              | 25     | 15.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-800.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 44     | 26.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 25     | 20.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 44     | 31.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kawashiro_mitori_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined from these clusters.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:------|:------|:------|:------|:------|:-----|
| 0 | 25 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | hair_bobbles, 1girl, solo, lock, layered_sleeves, blush, skirt, road_sign |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | hair_bobbles | 1girl | solo | lock | layered_sleeves | blush | skirt | road_sign |
|----:|----------:|:------|:------|:------|:------|:------|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 25 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X |
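With only 25 images, the full tag distribution is cheap to inspect. A frequency-table sketch over the raw package, assuming the extraction step from the card has been run:

```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag occurs across the dataset
counter = Counter()
for item in LocalSource('dataset_dir'):
    # use a set of tag names, so dict-shaped tag scores are not summed by mistake
    counter.update(set(item.meta['tags']))

for tag, n in counter.most_common(10):
    print(f'{tag}: {n}')
```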
CyberHarem/kawashiro_mitori_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T20:08:44+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T03:24:46+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of kawashiro\_mitori/河城みとり (Touhou) =========================================== This is the dataset of kawashiro\_mitori/河城みとり (Touhou), containing 25 images and their tags. The core tags of this character are 'hair\_ornament, hat, short\_hair, red\_eyes, pink\_hair, side\_ponytail', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
80c317b6497865b60bd8c7afb2d3b694c39d66f3
# Dataset of haniyasushin_keiki/埴安神袿姫/하니야스신케이키 (Touhou)

This is the dataset of haniyasushin_keiki/埴安神袿姫/하니야스신케이키 (Touhou), containing 500 images and their tags.

The core tags of this character are `blue_hair, long_hair, green_headwear, bangs, ribbon, arm_ribbon, breasts, red_eyes`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                                   | Type       | Description                                                           |
|:-----------------|-------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              | 500    | 674.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              | 500    | 370.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-800.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 1162   | 771.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 500    | 594.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1162   | 1.09 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/haniyasushin_keiki_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined from these clusters.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:------|:------|:------|:------|:------|:-----|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, green_apron, head_scarf, magatama_necklace, single_strap, smile, solo, yellow_dress, between_fingers, looking_at_viewer, open_mouth, tools, simple_background, black_background, short_sleeves, purple_eyes, upper_body |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, between_fingers, flower, green_apron, head_scarf, looking_at_viewer, magatama_necklace, smile, solo, yellow_dress, blue_ribbon, open_mouth, puffy_short_sleeves, purple_eyes, tools, fire, pocket, single_strap |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, green_apron, head_scarf, magatama_necklace, single_strap, smile, solo, yellow_dress, closed_mouth, looking_at_viewer, between_fingers, tools, pink_eyes, blush, puffy_short_sleeves, flower, pocket, purple_eyes |
| 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, between_fingers, blue_ribbon, green_apron, head_scarf, looking_at_viewer, magatama_necklace, pocket, sandals, simple_background, single_strap, smile, solo, tools, white_background, yellow_dress, full_body, purple_eyes, standing, flower, wide_sleeves, fire, juliet_sleeves, pink_eyes, closed_mouth |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, between_fingers, flower, green_apron, head_scarf, looking_at_viewer, magatama_necklace, open_mouth, pink_eyes, solo, tools, yellow_dress, pocket, blue_ribbon, puffy_sleeves, :d, long_sleeves, short_sleeves, upper_body, wide_sleeves |
| 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, barefoot, black_background, full_body, green_apron, head_scarf, holding, looking_at_viewer, magatama_necklace, short_sleeves, simple_background, solo, yellow_dress, closed_mouth, single_strap, black_eyes, puffy_sleeves, standing |
| 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, between_fingers, green_apron, green_belt, green_scarf, head_scarf, looking_at_viewer, magatama_necklace, open_mouth, pink_eyes, pocket, simple_background, smile, solo, yellow_dress, blue_ribbon, medium_breasts, puffy_short_sleeves, standing, tools, white_background, white_flower, blush, hair_between_eyes, hands_up, frills, yellow_sleeves |
| 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, large_breasts, simple_background, solo, blush, head_scarf, navel, purple_eyes, collarbone, looking_at_viewer, upper_body, closed_mouth, nude, puffy_nipples, armpits, hair_between_eyes, shiny, sweat, white_background |
| 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, hetero, large_breasts, nipples, solo_focus, 1boy, bar_censor, navel, open_mouth, penis, completely_nude, magatama, spread_legs, vaginal, cowgirl_position, cum_in_pussy, hair_between_eyes, head_scarf, heart-shaped_pupils, jewelry, sex_from_behind, simple_background, sweat, tongue_out |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | green_apron | head_scarf | magatama_necklace | single_strap | smile | solo | yellow_dress | between_fingers | looking_at_viewer | open_mouth | tools | simple_background | black_background | short_sleeves | purple_eyes | upper_body | flower | blue_ribbon | puffy_short_sleeves | fire | pocket | closed_mouth | pink_eyes | blush | sandals | white_background | full_body | standing | wide_sleeves | juliet_sleeves | puffy_sleeves | :d | long_sleeves | barefoot | holding | black_eyes | green_belt | green_scarf | medium_breasts | white_flower | hair_between_eyes | hands_up | frills | yellow_sleeves | large_breasts | navel | collarbone | nude | puffy_nipples | armpits | shiny | sweat | hetero | nipples | solo_focus | 1boy | bar_censor | penis | completely_nude | magatama | spread_legs | vaginal | cowgirl_position | cum_in_pussy | heart-shaped_pupils | jewelry | sex_from_behind | tongue_out |
|----:|----------:|:------|:------|:------|:------|:------|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |  |  |  | X |  | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | X | X | X |  | X |  |  |  | X |  | X |  | X |  | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | X | X | X | X |  | X | X |  |  | X |  | X | X |  | X | X | X | X |  | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X |  |  | X | X | X | X | X | X |  |  | X |  | X | X | X |  |  | X |  | X |  |  |  |  |  | X |  | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | X |  | X | X |  | X |  |  | X | X | X |  |  |  |  |  |  |  | X |  |  |  |  | X | X |  |  | X |  |  | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X |  | X | X | X | X | X | X | X | X |  |  |  |  |  | X | X |  | X |  | X | X |  | X |  | X |  |  |  |  |  |  |  |  | X | X | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X |  | X |  |  |  | X |  |  | X |  |  | X |  |  | X | X |  |  |  |  |  | X |  | X |  | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  | X |  |  |  | X | X | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X |  | X |  |  |  |  |  |  |  | X |  | X |  |  |  |  |  |  |  |  |  |  |  | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | X |  |  |  | X | X |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
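Clusters 7 and 8 above carry explicit tags, consistent with the repository's `not-for-all-audiences` label. Below is a sketch of building a filtered subset; it assumes the raw package is extracted to `dataset_dir` as in the card and that `item.meta['filename']` is a path relative to that directory. The blocklist is illustrative, not exhaustive.

```python
import os
import shutil

from waifuc.source import LocalSource

# illustrative blocklist drawn from the cluster tables above -- extend as needed
BLOCKED = {'nude', 'completely_nude', 'nipples', 'puffy_nipples', 'penis',
           'hetero', 'vaginal', 'sex_from_behind', 'cum_in_pussy'}

safe_dir = 'dataset_sfw'
os.makedirs(safe_dir, exist_ok=True)

for item in LocalSource('dataset_dir'):
    if not BLOCKED & set(item.meta['tags']):
        # copy the original file untouched; re-saving item.image would re-encode it
        src = os.path.join('dataset_dir', item.meta['filename'])
        shutil.copy(src, os.path.join(safe_dir, os.path.basename(item.meta['filename'])))
```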
CyberHarem/haniyasushin_keiki_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T20:13:55+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T01:04:21+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of haniyasushin\_keiki/埴安神袿姫/하니야스신케이키 (Touhou)
======================================================

This is the dataset of haniyasushin\_keiki/埴安神袿姫/하니야스신케이키 (Touhou), containing 500 images and their tags.

The core tags of this character are 'blue\_hair, long\_hair, green\_headwear, bangs, ribbon, arm\_ribbon, breasts, red\_eyes', which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).

List of Packages
----------------

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.

List of Clusters
----------------

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

### Table Version
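The loading snippet referenced above, reconstructed as a minimal sketch following the pattern used by the sibling CyberHarem datasets; the repo id `CyberHarem/haniyasushin_keiki_touhou` is taken from this record:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file for this dataset
zip_file = hf_hub_download(
    repo_id='CyberHarem/haniyasushin_keiki_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to a local directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc and iterate over items
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```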
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
e4d7f0025e10b954424d9613b345b3c5a07d8f86
# Dataset of komakusa_sannyo/駒草山如/코마쿠사산뇨 (Touhou)

This is the dataset of komakusa_sannyo/駒草山如/코마쿠사산뇨 (Touhou), containing 202 images and their tags.

The core tags of this character are `purple_hair, ponytail, ribbon, long_hair, hair_ribbon, yellow_ribbon, red_eyes, bangs, breasts`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                                 | Type       | Description                                                           |
|:-----------------|-------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              | 202    | 238.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komakusa_sannyo_touhou/resolve/main/dataset-raw.zip)               | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              | 202    | 139.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komakusa_sannyo_touhou/resolve/main/dataset-800.zip)               | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 467    | 290.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komakusa_sannyo_touhou/resolve/main/dataset-stage3-p480-800.zip)   | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 202    | 211.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komakusa_sannyo_touhou/resolve/main/dataset-1200.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 467    | 396.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komakusa_sannyo_touhou/resolve/main/dataset-stage3-p480-1200.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/komakusa_sannyo_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
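As an aside before the cluster tables: the package list above also offers pre-processed IMG+TXT variants, which can be fetched with the same `hf_hub_download` call. A minimal sketch for `dataset-800.zip` (the filename is taken from the table; whether the archive unpacks directly to image/tag-file pairs is an assumption):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch the pre-processed 800px package instead of the raw one
zip_file = hf_hub_download(
    repo_id='CyberHarem/komakusa_sannyo_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# unpack; the IMG+TXT type suggests images paired with .txt tag files
out_dir = 'komakusa_800'  # hypothetical output directory
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)
```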
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, kiseru, red_kimono, solo, holding_smoking_pipe, looking_at_viewer, smoke, wide_sleeves, long_sleeves, simple_background, smile, white_background, open_mouth, purple_skirt, blush, upper_body | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, full_body, kiseru, long_sleeves, purple_skirt, red_kimono, simple_background, solo, white_background, wide_sleeves, folded_fan, holding_fan, robe, smile, smoke, sandals, standing, geta, holding_smoking_pipe, looking_at_viewer | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, folded_fan, holding_fan, holding_smoking_pipe, kiseru, long_sleeves, looking_at_viewer, purple_skirt, red_kimono, robe, smoke, solo, wide_sleeves, smile, closed_mouth, parted_bangs, standing, medium_breasts | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | kiseru | red_kimono | solo | holding_smoking_pipe | looking_at_viewer | smoke | wide_sleeves | long_sleeves | simple_background | smile | white_background | open_mouth | purple_skirt | blush | upper_body | full_body | folded_fan | holding_fan | robe | sandals | standing | geta | closed_mouth | parted_bangs | medium_breasts | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------------|:-------|:-----------------------|:--------------------|:--------|:---------------|:---------------|:--------------------|:--------|:-------------------|:-------------|:---------------|:--------|:-------------|:------------|:-------------|:--------------|:-------|:----------|:-----------|:-------|:---------------|:---------------|:-----------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | X | X | X | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | X | X | | X | | | X | | | | X | X | X | | X | | X | X | X |
CyberHarem/komakusa_sannyo_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T20:37:21+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T05:10:53+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of komakusa\_sannyo/駒草山如/코마쿠사산뇨 (Touhou)
================================================

This is the dataset of komakusa\_sannyo/駒草山如/코마쿠사산뇨 (Touhou), containing 202 images and their tags.

The core tags of this character are 'purple\_hair, ponytail, ribbon, long\_hair, hair\_ribbon, yellow\_ribbon, red\_eyes, bangs, breasts', which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).

List of Packages
----------------

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.

List of Clusters
----------------

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
24df36cc408c898b0bb2b039bab36ccf70f7b7c2
# Dataset Card for "seperate_0" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
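A minimal loading sketch based on the `dataset_info` metadata for this repo, which lists SQuAD-style features (`id`, `title`, `context`, `question`, `answers` with `answer_start`/`text`) and a single `train` split; the field names are taken from that metadata and not otherwise verified. The same pattern should apply to the sibling `seperate_*` repos below.

```python
from datasets import load_dataset

# load the train split (the only split listed in the metadata)
ds = load_dataset("Jing24/seperate_0", split="train")

# each example carries SQuAD-style fields:
# id, title, context, question, and answers {answer_start, text}
example = ds[0]
print(example["question"])
print(example["answers"]["text"])
```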
Jing24/seperate_0
[ "region:us" ]
2023-08-18T20:39:16+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 8063353, "num_examples": 9208}], "download_size": 1455012, "dataset_size": 8063353}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_0" More Information needed
[ "# Dataset Card for \"seperate_0\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_0\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_0\"\n\nMore Information needed" ]
43d4beee742ec6aba687b465a9efeb78f271de00
# Dataset Card for "seperate_1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_1
[ "region:us" ]
2023-08-18T20:39:19+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 7568218, "num_examples": 8021}], "download_size": 1401063, "dataset_size": 7568218}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_1" More Information needed
[ "# Dataset Card for \"seperate_1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_1\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_1\"\n\nMore Information needed" ]
2874a329838eea4ede8e4f6efc5846a99db37898
# Dataset Card for "seperate_2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_2
[ "region:us" ]
2023-08-18T20:39:21+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 6921907, "num_examples": 7848}], "download_size": 1327593, "dataset_size": 6921907}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_2" More Information needed
[ "# Dataset Card for \"seperate_2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_2\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_2\"\n\nMore Information needed" ]
9209b4894c63fde3c676c1f2812ea18eae50c153
# Dataset Card for "seperate_3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_3
[ "region:us" ]
2023-08-18T20:39:23+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 6880782, "num_examples": 7720}], "download_size": 1220030, "dataset_size": 6880782}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_3" More Information needed
[ "# Dataset Card for \"seperate_3\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_3\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_3\"\n\nMore Information needed" ]
5ba6a4231e73d7567d7283e553e693ee67ab6c7e
# Dataset Card for "seperate_4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_4
[ "region:us" ]
2023-08-18T20:39:25+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 7565982, "num_examples": 8162}], "download_size": 1382972, "dataset_size": 7565982}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_4" More Information needed
[ "# Dataset Card for \"seperate_4\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_4\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_4\"\n\nMore Information needed" ]
af98cb7de9787e0e4f577a2e879721a32026bf15
# Dataset Card for "seperate_5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_5
[ "region:us" ]
2023-08-18T20:39:27+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 8027732, "num_examples": 8533}], "download_size": 1385199, "dataset_size": 8027732}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_5" More Information needed
[ "# Dataset Card for \"seperate_5\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_5\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_5\"\n\nMore Information needed" ]
11da042477437ff12a00262ad7f5bcfe8105cab9
# Dataset Card for "seperate_6" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_6
[ "region:us" ]
2023-08-18T20:39:29+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 7073760, "num_examples": 7809}], "download_size": 1306657, "dataset_size": 7073760}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_6" More Information needed
[ "# Dataset Card for \"seperate_6\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_6\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_6\"\n\nMore Information needed" ]
1fb90d7c8544b26aa8bcc3d8292e4b24edb2c30c
# Dataset Card for "seperate_7" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_7
[ "region:us" ]
2023-08-18T20:39:31+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 6301415, "num_examples": 6897}], "download_size": 1157468, "dataset_size": 6301415}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_7" More Information needed
[ "# Dataset Card for \"seperate_7\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_7\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_7\"\n\nMore Information needed" ]
6a47f8360994f9060411f18968ce2f6b8273cbab
# Dataset Card for "seperate_8" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_8
[ "region:us" ]
2023-08-18T20:39:33+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 7635652, "num_examples": 8731}], "download_size": 1353779, "dataset_size": 7635652}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_8" More Information needed
[ "# Dataset Card for \"seperate_8\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_8\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_8\"\n\nMore Information needed" ]
59bebeb2c1d14d2a426923ea75ebe7a715abe7a7
# Dataset Card for "seperate_9" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_9
[ "region:us" ]
2023-08-18T20:39:35+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 6039075, "num_examples": 6803}], "download_size": 1167345, "dataset_size": 6039075}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_9" More Information needed
[ "# Dataset Card for \"seperate_9\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_9\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_9\"\n\nMore Information needed" ]
a333b5ef2643690a9e78501255c4e344ac848f15
# Dataset Card for "seperate_10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_10
[ "region:us" ]
2023-08-18T20:39:38+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 6991061, "num_examples": 7503}], "download_size": 1295926, "dataset_size": 6991061}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:39:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_10" More Information needed
[ "# Dataset Card for \"seperate_10\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_10\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_10\"\n\nMore Information needed" ]
ae692e78608a5f83a2f5c016dc76e7f419e6af97
# Dataset Card for "seperate_all0" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
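The `seperate_all0` through `seperate_all10` repos below share the same schema in their metadata, so, assuming they are complementary shards of one corpus (the cards do not say), they can be loaded and stitched back together. A minimal sketch:

```python
from datasets import concatenate_datasets, load_dataset

# load each shard's train split; the repos share an identical schema,
# which is what concatenate_datasets requires
shards = [
    load_dataset(f"Jing24/seperate_all{i}", split="train")
    for i in range(11)  # seperate_all0 .. seperate_all10
]
combined = concatenate_datasets(shards)
print(len(combined))
```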
Jing24/seperate_all0
[ "region:us" ]
2023-08-18T20:41:51+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 8063353, "num_examples": 9208}], "download_size": 1455012, "dataset_size": 8063353}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:41:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all0" More Information needed
[ "# Dataset Card for \"seperate_all0\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all0\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all0\"\n\nMore Information needed" ]
096974605dd5d336b91dea271b9869140e00f747
# Dataset Card for "seperate_all_sub0" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub0
[ "region:us" ]
2023-08-18T20:41:54+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 71282755, "num_examples": 78391}], "download_size": 13012921, "dataset_size": 71282755}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:41:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub0" More Information needed
[ "# Dataset Card for \"seperate_all_sub0\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub0\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub0\"\n\nMore Information needed" ]
cb340c3b9dac4a7352580435b1a5daf487afe1f0
# Dataset Card for "seperate_all1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all1
[ "region:us" ]
2023-08-18T20:41:57+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 15631571, "num_examples": 17229}], "download_size": 2844837, "dataset_size": 15631571}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:41:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all1" More Information needed
[ "# Dataset Card for \"seperate_all1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all1\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all1\"\n\nMore Information needed" ]
eebc265c15cda73073481b99825f3f049b5b427a
# Dataset Card for "seperate_all_sub1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub1
[ "region:us" ]
2023-08-18T20:41:59+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 63714537, "num_examples": 70370}], "download_size": 11609323, "dataset_size": 63714537}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub1" More Information needed
[ "# Dataset Card for \"seperate_all_sub1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub1\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub1\"\n\nMore Information needed" ]
bfbe9fa76a264fe81a2305a904acf5f871111dad
# Dataset Card for "seperate_all2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all2
[ "region:us" ]
2023-08-18T20:42:01+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 22553478, "num_examples": 25077}], "download_size": 4170668, "dataset_size": 22553478}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all2" More Information needed
[ "# Dataset Card for \"seperate_all2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all2\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all2\"\n\nMore Information needed" ]
5b75ba5362660d894adf7cbbb527be6c1f72c9ee
# Dataset Card for "seperate_all_sub2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub2
[ "region:us" ]
2023-08-18T20:42:04+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 56792630, "num_examples": 62522}], "download_size": 10294052, "dataset_size": 56792630}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub2" More Information needed
[ "# Dataset Card for \"seperate_all_sub2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub2\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub2\"\n\nMore Information needed" ]
a8662a81b1ef158ae4874a2f317e49b2952e0bb1
# Dataset Card for "seperate_all3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all3
[ "region:us" ]
2023-08-18T20:42:06+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 29434260, "num_examples": 32797}], "download_size": 5385948, "dataset_size": 29434260}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all3" More Information needed
[ "# Dataset Card for \"seperate_all3\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all3\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all3\"\n\nMore Information needed" ]
70e79c112c52db1d58177b8836fd65cf9fc673bd
# Dataset Card for "seperate_all_sub3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub3
[ "region:us" ]
2023-08-18T20:42:09+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 49911848, "num_examples": 54802}], "download_size": 9070743, "dataset_size": 49911848}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub3" More Information needed
[ "# Dataset Card for \"seperate_all_sub3\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub3\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub3\"\n\nMore Information needed" ]
67d2b45f6ef2269c1158b87c025e3be1453cb6df
# Dataset Card for "seperate_all4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all4
[ "region:us" ]
2023-08-18T20:42:11+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 37000242, "num_examples": 40959}], "download_size": 6763324, "dataset_size": 37000242}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all4" More Information needed
[ "# Dataset Card for \"seperate_all4\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all4\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all4\"\n\nMore Information needed" ]
4cef31910d292b1401be93fd4301d525ec69e2cc
# Dataset Card for "seperate_all_sub4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub4
[ "region:us" ]
2023-08-18T20:42:13+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 42345866, "num_examples": 46640}], "download_size": 7708985, "dataset_size": 42345866}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub4" More Information needed
[ "# Dataset Card for \"seperate_all_sub4\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub4\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub4\"\n\nMore Information needed" ]
be02e13f7b50ab727d6f896f0ce717aa3c8b8f30
# Dataset Card for "seperate_all5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all5
[ "region:us" ]
2023-08-18T20:42:17+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 45027974, "num_examples": 49492}], "download_size": 8152264, "dataset_size": 45027974}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all5" More Information needed
[ "# Dataset Card for \"seperate_all5\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all5\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all5\"\n\nMore Information needed" ]
73a7fc8a2ae7a7f311345e38f18b979dd2e35ae7
# Dataset Card for "seperate_all_sub5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub5
[ "region:us" ]
2023-08-18T20:42:19+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 34318134, "num_examples": 38107}], "download_size": 6315411, "dataset_size": 34318134}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub5" More Information needed
[ "# Dataset Card for \"seperate_all_sub5\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub5\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub5\"\n\nMore Information needed" ]
34869bf40293b01a9f6bd194ba52d7df968409ce
# Dataset Card for "seperate_all6" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all6
[ "region:us" ]
2023-08-18T20:42:21+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 52101734, "num_examples": 57301}], "download_size": 9453404, "dataset_size": 52101734}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all6" More Information needed
[ "# Dataset Card for \"seperate_all6\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all6\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all6\"\n\nMore Information needed" ]
3c6d2e806de5c0de0f6a35904a509eb122770c77
# Dataset Card for "seperate_all_sub6" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub6
[ "region:us" ]
2023-08-18T20:42:23+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 27244374, "num_examples": 30298}], "download_size": 5018838, "dataset_size": 27244374}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub6" More Information needed
[ "# Dataset Card for \"seperate_all_sub6\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub6\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub6\"\n\nMore Information needed" ]
7627530d280b3a428399ef9ccb6fb63d780337fe
# Dataset Card for "seperate_all7" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all7
[ "region:us" ]
2023-08-18T20:42:26+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 58403149, "num_examples": 64198}], "download_size": 10608919, "dataset_size": 58403149}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all7" More Information needed
[ "# Dataset Card for \"seperate_all7\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all7\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all7\"\n\nMore Information needed" ]
b9f172b488fcf5f77f268495ccde0b6468736c21
# Dataset Card for "seperate_all_sub7" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub7
[ "region:us" ]
2023-08-18T20:42:28+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 20942959, "num_examples": 23401}], "download_size": 3863400, "dataset_size": 20942959}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub7" More Information needed
[ "# Dataset Card for \"seperate_all_sub7\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub7\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub7\"\n\nMore Information needed" ]
4c9537593883b340e83350ffbc0cdb7e3e75dffc
# Dataset Card for "seperate_all8" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all8
[ "region:us" ]
2023-08-18T20:42:30+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 66038801, "num_examples": 72929}], "download_size": 11959899, "dataset_size": 66038801}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all8" More Information needed
[ "# Dataset Card for \"seperate_all8\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all8\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all8\"\n\nMore Information needed" ]
2d6fd2e65118c72da9d5f7497c5d8eaf8b2ca752
# Dataset Card for "seperate_all_sub8" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub8
[ "region:us" ]
2023-08-18T20:42:33+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 13307307, "num_examples": 14670}], "download_size": 2515004, "dataset_size": 13307307}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub8" More Information needed
[ "# Dataset Card for \"seperate_all_sub8\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub8\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub8\"\n\nMore Information needed" ]
0a5a05e3d525587677f3145850dc7d994f336941
# Dataset Card for "seperate_all9" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all9
[ "region:us" ]
2023-08-18T20:42:35+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 72077876, "num_examples": 79732}], "download_size": 13125316, "dataset_size": 72077876}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all9" More Information needed
[ "# Dataset Card for \"seperate_all9\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all9\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all9\"\n\nMore Information needed" ]
589c89a02a3fbc96e454e096c8fcb04ba90bf6e5
# Dataset Card for "seperate_all_sub9" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub9
[ "region:us" ]
2023-08-18T20:42:37+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 7268232, "num_examples": 7867}], "download_size": 1346920, "dataset_size": 7268232}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub9" More Information needed
[ "# Dataset Card for \"seperate_all_sub9\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub9\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub9\"\n\nMore Information needed" ]
62678781bd4205b8f03ce05fffb1368e03168ca5
# Dataset Card for "seperate_all10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all10
[ "region:us" ]
2023-08-18T20:42:39+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 79068937, "num_examples": 87235}], "download_size": 14409759, "dataset_size": 79068937}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all10" More Information needed
[ "# Dataset Card for \"seperate_all10\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all10\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all10\"\n\nMore Information needed" ]
39d2b94e742ffaf8fd7a1565c06bbf0f368d5763
# Dataset Card for "seperate_all_sub10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Jing24/seperate_all_sub10
[ "region:us" ]
2023-08-18T20:42:42+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int32"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 277171, "num_examples": 364}], "download_size": 64920, "dataset_size": 277171}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T20:42:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "seperate_all_sub10" More Information needed
[ "# Dataset Card for \"seperate_all_sub10\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"seperate_all_sub10\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"seperate_all_sub10\"\n\nMore Information needed" ]
456a91aebe0234535f81f78dbfe0fe92d5777979
# Dataset of himemushi_momoyo/姫虫百々世/히메무시모모요 (Touhou)

This is the dataset of himemushi_momoyo/姫虫百々世/히메무시모모요 (Touhou), containing 33 images and their tags.

The core tags of this character are `long_hair, grey_hair, bow, orange_bow, ribbon, grey_eyes, breasts, very_long_hair, arm_ribbon, blue_eyes, bangs, blue_hair, orange_ribbon`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size      | Download                                                                                                                  | Type       | Description                                                           |
|:-----------------|-------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              | 33     | 47.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himemushi_momoyo_touhou/resolve/main/dataset-raw.zip)               | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              | 33     | 25.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himemushi_momoyo_touhou/resolve/main/dataset-800.zip)               | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 87     | 56.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himemushi_momoyo_touhou/resolve/main/dataset-stage3-p480-800.zip)   | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 33     | 41.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himemushi_momoyo_touhou/resolve/main/dataset-1200.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 87     | 79.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himemushi_momoyo_touhou/resolve/main/dataset-stage3-p480-1200.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/himemushi_momoyo_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
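Once one of the IMG+TXT packages above is extracted, the tag files can be paired back with their images. A minimal sketch, assuming each image sits next to a same-named `.txt` tag file (the cards do not spell out the naming convention):

```python
from pathlib import Path

# after extracting e.g. dataset-800.zip into dataset_dir
dataset_dir = Path('dataset_dir')
for txt_file in sorted(dataset_dir.rglob('*.txt')):
    tags = txt_file.read_text(encoding='utf-8').strip()
    # look for a sibling image sharing the same stem
    for ext in ('.png', '.jpg', '.webp'):
        image = txt_file.with_suffix(ext)
        if image.exists():
            print(image.name, '->', tags)
            break
```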
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, black_shirt, looking_at_viewer, holding, short_sleeves, smile, pickaxe, shovel, ring, midriff, simple_background, black_skirt, white_background, crop_top, navel, open_mouth | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_shirt | looking_at_viewer | holding | short_sleeves | smile | pickaxe | shovel | ring | midriff | simple_background | black_skirt | white_background | crop_top | navel | open_mouth | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:--------------------|:----------|:----------------|:--------|:----------|:---------|:-------|:----------|:--------------------|:--------------|:-------------------|:-----------|:--------|:-------------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/himemushi_momoyo_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T20:43:39+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T02:56:37+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of himemushi\_momoyo/姫虫百々世/히메무시모모요 (Touhou)
===================================================

This is the dataset of himemushi\_momoyo/姫虫百々世/히메무시모모요 (Touhou), containing 33 images and their tags.

The core tags of this character are 'long\_hair, grey\_hair, bow, orange\_bow, ribbon, grey\_eyes, breasts, very\_long\_hair, arm\_ribbon, blue\_eyes, bangs, blue\_hair, orange\_ribbon', which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).

List of Packages
----------------

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.

List of Clusters
----------------

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
8e450eab6502f5d78c38f263d5afe04b117a78b7
# Dataset of luna_child/ルナチャイルド/루나차일드 (Touhou) This is the dataset of luna_child/ルナチャイルド/루나차일드 (Touhou), containing 500 images and their tags. The core tags of this character are `blonde_hair, drill_hair, hat, short_hair, wings, red_eyes, bow, fairy_wings, white_headwear`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 411.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 300.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1021 | 583.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 388.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1021 | 713.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/luna_child_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, open_mouth, solo, dress, chestnut_mouth | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bangs, black_bowtie, looking_at_viewer, open_mouth, solo, white_dress, hair_between_eyes, long_sleeves, simple_background, blush, chestnut_mouth, puffy_sleeves, white_background, drill_locks, one-hour_drawing_challenge, upper_body, wide_sleeves | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 2girls, dress, chestnut_mouth, open_mouth | | 3 | 23 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | loli, 1girl, nipples, nude, blush, solo, flat_chest, pussy, navel, open_mouth, chestnut_mouth | | 4 | 14 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | hetero, 1girl, loli, penis, solo_focus, 1boy, blush, flat_chest, nipples, sex, nude, open_mouth, vaginal, censored, cum_in_pussy, navel, tears, chestnut_mouth | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, blush, hetero, loli, penis, solo_focus, censored, facial, fellatio, cum_on_body, flat_chest, nipples, nude, one_eye_closed | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, loli, no_panties, pussy, solo, blush, dress_lift, peeing, navel, censored, squatting | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, blush, flat_chest, loli, solo, nipples, topless, barefoot, white_panties, underwear_only | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | bangs, beret, blush, cowboy_shot, long_sleeves, pleated_skirt, sailor_collar, serafuku, white_panties, yellow_neckerchief, 1girl, bespectacled, plaid_skirt, solo, alternate_costume, indoors, miniskirt, standing, contemporary, grey_skirt, hair_between_eyes, looking_at_viewer, sleeves_past_wrists | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | open_mouth | solo | dress | 
chestnut_mouth | bangs | black_bowtie | looking_at_viewer | white_dress | hair_between_eyes | long_sleeves | simple_background | puffy_sleeves | white_background | drill_locks | one-hour_drawing_challenge | upper_body | wide_sleeves | 2girls | loli | nipples | nude | flat_chest | pussy | navel | hetero | penis | solo_focus | 1boy | sex | vaginal | censored | cum_in_pussy | tears | facial | fellatio | cum_on_body | one_eye_closed | no_panties | dress_lift | peeing | squatting | topless | barefoot | white_panties | underwear_only | beret | cowboy_shot | pleated_skirt | sailor_collar | serafuku | yellow_neckerchief | bespectacled | plaid_skirt | alternate_costume | indoors | miniskirt | standing | contemporary | grey_skirt | sleeves_past_wrists | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------------|:-------|:--------|:-----------------|:--------|:---------------|:--------------------|:--------------|:--------------------|:---------------|:--------------------|:----------------|:-------------------|:--------------|:-----------------------------|:-------------|:---------------|:---------|:-------|:----------|:-------|:-------------|:--------|:--------|:---------|:--------|:-------------|:-------|:------|:----------|:-----------|:---------------|:--------|:---------|:-----------|:--------------|:-----------------|:-------------|:-------------|:---------|:------------|:----------|:-----------|:----------------|:-----------------|:--------|:--------------|:----------------|:----------------|:-----------|:---------------------|:---------------|:--------------|:--------------------|:----------|:------------|:-----------|:---------------|:-------------|:----------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | | | X | | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 23 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 14 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | | | X | | | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 8 | ![](samples/5/clu5-sample0.png) | 
![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | | | X | X | X | X | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | X | | | | | | | | | | | | | | | | | X | | | | X | X | | | | | | | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | X | | | | | | | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | X | | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/luna_child_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T21:15:06+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-14T23:33:22+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of luna\_child/ルナチャイルド/루나차일드 (Touhou) ============================================= This is the dataset of luna\_child/ルナチャイルド/루나차일드 (Touhou), containing 500 images and their tags. The core tags of this character are 'blonde\_hair, drill\_hair, hat, short\_hair, wings, red\_eyes, bow, fairy\_wings, white\_headwear', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
4099fb279dc6fc1991f84056f9518806f3d1a851
# Dataset of okunoda_miyoi (Touhou) This is the dataset of okunoda_miyoi (Touhou), containing 242 images and their tags. The core tags of this character are `pink_hair, blue_headwear, hat, green_eyes, breasts, short_hair, bangs, hair_between_eyes, animal_hat, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 242 | 313.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okunoda_miyoi_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 242 | 178.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okunoda_miyoi_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 579 | 397.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okunoda_miyoi_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 242 | 276.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okunoda_miyoi_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 579 | 575.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okunoda_miyoi_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/okunoda_miyoi_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blue_shirt, looking_at_viewer, simple_background, solo, white_background, long_sleeves, purple_skirt, smile, blush, closed_mouth, fish_print, gourd, holding, white_shirt, standing | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blue_shirt, fish_print, looking_at_viewer, purple_skirt, short_sleeves, simple_background, solo, blue_apron, holding, open_mouth, :d, white_background, blush, cross-laced_clothes, ofuda_on_clothes | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blue_shirt, holding_tray, open_mouth, purple_skirt, solo, white_background, apron, fish_print, bottle, ofuda_on_clothes, short_sleeves, simple_background, tokkuri, gourd, looking_at_viewer, :d, blush, choko_(cup), full_body | | 3 | 16 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blue_shirt, looking_at_viewer, solo, blush, short_sleeves, open_mouth, simple_background, white_background, upper_body, :d, purple_skirt, holding | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blush, 1boy, hetero, solo_focus, nipples, penis, open_mouth, looking_at_viewer, paizuri, smile, upper_body, mosaic_censoring, sweat, cum_on_body, facial, huge_breasts, indoors, nude, one_eye_closed | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_shirt | looking_at_viewer | simple_background | solo | white_background | long_sleeves | purple_skirt | smile | blush | closed_mouth | fish_print | gourd | holding | white_shirt | standing | short_sleeves | blue_apron | open_mouth | :d | cross-laced_clothes | ofuda_on_clothes | holding_tray | apron | bottle | tokkuri | choko_(cup) | full_body | upper_body | 1boy | hetero | solo_focus | nipples | penis | paizuri | mosaic_censoring | sweat | cum_on_body | facial | huge_breasts | indoors | nude | one_eye_closed | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:--------------------|:-------|:-------------------|:---------------|:---------------|:--------|:--------|:---------------|:-------------|:--------|:----------|:--------------|:-----------|:----------------|:-------------|:-------------|:-----|:----------------------|:-------------------|:---------------|:--------|:---------|:----------|:--------------|:------------|:-------------|:-------|:---------|:-------------|:----------|:--------|:----------|:-------------------|:--------|:--------------|:---------|:---------------|:----------|:-------|:-----------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | X | | X | | X | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | | X | | X | | X | X | | | | X | | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | 3 | 16 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | | X | | X | | | | X | | | X | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | | | | | X | X | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/okunoda_miyoi_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T21:20:37+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T08:08:22+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of okunoda\_miyoi (Touhou) ================================== This is the dataset of okunoda\_miyoi (Touhou), containing 242 images and their tags. The core tags of this character are 'pink\_hair, blue\_headwear, hat, green\_eyes, breasts, short\_hair, bangs, hair\_between\_eyes, animal\_hat, large\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
4ce568300b03b04f362b59b0f5697ff476937151
# Dataset of konngara (Touhou) This is the dataset of konngara (Touhou), containing 89 images and their tags. The core tags of this character are `horns, single_horn, red_eyes, black_hair, ponytail, long_hair, ribbon`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 89 | 77.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 89 | 54.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 171 | 101.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 89 | 72.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 171 | 127.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/konngara_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, japanese_clothes, katana, solo, profile, sheath | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, sakazuki, solo, wide_sleeves, katana, kimono, hair_bow, looking_at_viewer, holding | | 2 | 12 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, holding_sword, solo, katana, long_sleeves, looking_at_viewer, wide_sleeves, closed_mouth, hair_ribbon, bangs, holding_cup, sakazuki, red_kimono, red_ribbon, simple_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | japanese_clothes | katana | solo | profile | sheath | sakazuki | wide_sleeves | kimono | hair_bow | looking_at_viewer | holding | holding_sword | long_sleeves | closed_mouth | hair_ribbon | bangs | holding_cup | red_kimono | red_ribbon | simple_background | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:---------|:-------|:----------|:---------|:-----------|:---------------|:---------|:-----------|:--------------------|:----------|:----------------|:---------------|:---------------|:--------------|:--------|:--------------|:-------------|:-------------|:--------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | | | X | X | X | X | X | X | | | | | | | | | | | 2 | 12 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | | | X | X | | | X | | X | X | X | X | X | X | X | X | X |
CyberHarem/konngara_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T21:33:48+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T04:51:30+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of konngara (Touhou) ============================ This is the dataset of konngara (Touhou), containing 89 images and their tags. The core tags of this character are 'horns, single\_horn, red\_eyes, black\_hair, ponytail, long\_hair, ribbon', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
1c8222832449f6a8e6ae1180c3ff4e8808e13f43
# Dataset Card for "apitext_dirty" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
havens2/apitext_dirty
[ "region:us" ]
2023-08-18T21:37:39+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6472042, "num_examples": 8830}], "download_size": 2694540, "dataset_size": 6472042}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-18T21:37:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for "apitext_dirty" More Information needed
[ "# Dataset Card for \"apitext_dirty\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"apitext_dirty\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"apitext_dirty\"\n\nMore Information needed" ]
171665c64cd7cad651df298b3345d92042ff0575
# The Emerald Tablets of Thoth the Atlantean 📔 Based on the [Translation by Dr. Doreal](https://www.crystalinks.com/emerald.html) "Literal translation and interpretation of one of the most ancient and secret of the great works of ancient wisdom." I made this available because, as far as I know, nobody else has yet.
shadowsword/thoth
[ "size_categories:1K<n<10K", "language:en", "license:unknown", "region:us" ]
2023-08-18T21:40:10+00:00
{"language": ["en"], "license": "unknown", "size_categories": ["1K<n<10K"], "pretty_name": "Emerald Tablets of Thoth the Atlantean"}
2023-08-18T22:06:06+00:00
[]
[ "en" ]
TAGS #size_categories-1K<n<10K #language-English #license-unknown #region-us
# The Emerald Tablets of Thoth the Atlantean Based on the Translation by Dr. Doreal "Literal translation and interpretation of one of the most ancient and secret of the great works of ancient wisdom." I made this available because, as far as I know, nobody else has yet.
[ "# The Emerald Tablets of Thoth the Atlantean\n Based on the Translation by Dr. Doreal\n\n\"Literal translation and interpretation of one of the most\nancient and secret of the great works of ancient wisdom.\"\n\nI made this available, because nobody else did yet, as far as I know." ]
[ "TAGS\n#size_categories-1K<n<10K #language-English #license-unknown #region-us \n", "# The Emerald Tablets of Thoth the Atlantean\n Based on the Translation by Dr. Doreal\n\n\"Literal translation and interpretation of one of the most\nancient and secret of the great works of ancient wisdom.\"\n\nI made this available, because nobody else did yet, as far as I know." ]
[ 29, 65 ]
[ "passage: TAGS\n#size_categories-1K<n<10K #language-English #license-unknown #region-us \n# The Emerald Tablets of Thoth the Atlantean\n Based on the Translation by Dr. Doreal\n\n\"Literal translation and interpretation of one of the most\nancient and secret of the great works of ancient wisdom.\"\n\nI made this available, because nobody else did yet, as far as I know." ]
a910bbc9c4cf93317c0e60f119ea11fc37c53f79
# Dataset of watatsuki_no_yorihime/綿月依姫 (Touhou) This is the dataset of watatsuki_no_yorihime/綿月依姫 (Touhou), containing 130 images and their tags. The core tags of this character are `purple_hair, long_hair, ponytail, ribbon, bow, hair_bow, hair_ribbon, breasts, red_eyes, large_breasts, purple_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 130 | 127.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 130 | 88.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 287 | 165.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 130 | 119.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 287 | 204.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/watatsuki_no_yorihime_touhou', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, looking_at_viewer, solo, white_panties, bangs, white_shirt, belt, collared_shirt, long_sleeves, open_clothes, red_dress, shiny_skin, collarbone, nipples, shiny_hair, thighs, very_long_hair, wing_collar | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, belt, katana, solo, bracelet | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, belt, bracelet, katana, solo, boots, fire, sheath | | 3 | 32 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, hetero, blush, solo_focus, censored, nipples, penis, 1boy, pussy, sex, open_mouth, vaginal, cum, tears | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | white_panties | bangs | white_shirt | belt | collared_shirt | long_sleeves | open_clothes | red_dress | shiny_skin | collarbone | nipples | shiny_hair | thighs | very_long_hair | wing_collar | katana | bracelet | boots | fire | sheath | hetero | solo_focus | censored | penis | 1boy | pussy | sex | open_mouth | vaginal | cum | tears | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:----------------|:--------|:--------------|:-------|:-----------------|:---------------|:---------------|:------------|:-------------|:-------------|:----------|:-------------|:---------|:-----------------|:--------------|:---------|:-----------|:--------|:-------|:---------|:---------|:-------------|:-----------|:--------|:-------|:--------|:------|:-------------|:----------|:------|:--------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | | | | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | 3 | 
32 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/watatsuki_no_yorihime_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T21:43:54+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-14T23:48:37+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of watatsuki\_no\_yorihime/綿月依姫 (Touhou) ================================================ This is the dataset of watatsuki\_no\_yorihime/綿月依姫 (Touhou), containing 130 images and their tags. The core tags of this character are 'purple\_hair, long\_hair, ponytail, ribbon, bow, hair\_bow, hair\_ribbon, breasts, red\_eyes, large\_breasts, purple\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
b1802b90dbf4dae5058b36203b36f24df52a2bf2
## Dataset Description TODO ### Dataset Summary TODO ## Dataset Creation TODO
Cristofher/perritos_y_no_perritos
[ "task_categories:image-classification", "annotations_creators:found", "size_categories:n<1K", "source_datasets:original", "license:apache-2.0", "animals", "dogs", "creature-dataset", "region:us" ]
2023-08-18T21:57:54+00:00
{"annotations_creators": ["found"], "language_creators": [], "language": [], "license": ["apache-2.0"], "multilinguality": [], "size_categories": ["n<1K"], "source_datasets": ["original"], "task_categories": ["image-classification"], "task_ids": ["binary-class-image-classification"], "pretty_name": "Perritos-y-no-Perritos", "tags": ["animals", "dogs", "creature-dataset"]}
2023-08-19T01:48:31+00:00
[]
[]
TAGS #task_categories-image-classification #annotations_creators-found #size_categories-n<1K #source_datasets-original #license-apache-2.0 #animals #dogs #creature-dataset #region-us
## Dataset Description TODO ### Dataset Summary TODO ## Dataset Creation TODO
[ "## Dataset Description\n\nTODO", "### Dataset Summary\n\nTODO", "## Dataset Creatioon\n\nTODO" ]
[ "TAGS\n#task_categories-image-classification #annotations_creators-found #size_categories-n<1K #source_datasets-original #license-apache-2.0 #animals #dogs #creature-dataset #region-us \n", "## Dataset Description\n\nTODO", "### Dataset Summary\n\nTODO", "## Dataset Creatioon\n\nTODO" ]
[ 66, 6, 8, 8 ]
[ "passage: TAGS\n#task_categories-image-classification #annotations_creators-found #size_categories-n<1K #source_datasets-original #license-apache-2.0 #animals #dogs #creature-dataset #region-us \n## Dataset Description\n\nTODO### Dataset Summary\n\nTODO## Dataset Creatioon\n\nTODO" ]
84e6b4d951a9ddbadb5590aac4fd3fcf629fa5ef
# Dataset Card for `Reddit-Movie-small-V1` ## Dataset Description - **Homepage:** https://github.com/AaronHeee/LLMs-as-Zero-Shot-Conversational-RecSys - **Repository:** https://github.com/AaronHeee/LLMs-as-Zero-Shot-Conversational-RecSys - **Paper:** To appear - **Point of Contact:** [email protected] ### Dataset Summary This dataset contains recommendation-related conversations in the movie domain, intended only for research use in, e.g., conversational recommendation and long-query retrieval tasks. This dataset ranges from Jan. 2022 to Dec. 2022. A larger version (from Jan. 2012 to Dec. 2022) can be found [here](https://huggingface.co/datasets/ZhankuiHe/reddit_movie_large_v1). ### Dataset Processing We dumped [Reddit](https://reddit.com) conversations from [pushshift.io](https://pushshift.io) and converted them into [raw text](https://huggingface.co/datasets/ZhankuiHe/reddit_movie_raw) about movie recommendations from five subreddits: - [r/movies](https://www.reddit.com/r/movies/) - [r/moviesuggestions](https://www.reddit.com/r/suggestions/) - [r/bestofnetflix](https://www.reddit.com/r/bestofnetflix/) - [r/nextflixbestof](https://www.reddit.com/r/netflixbestof/) - [r/truefilm](https://www.reddit.com/r/truefilm/) After that, we processed them by: 1. extracting movie recommendation conversations; 2. recognizing movie mentions in raw text; 3. linking movie mentions to existing movie entities in the [IMDB](https://imdb.com) database. Since the raw text is quite noisy and processing is not perfect, we do observe some failure cases in our processed data. Thus we use V1 to highlight that this processed version is the first version. Contributions of cleaner processed versions (such as V2) are welcome in the future, many thanks! ### Disclaimer ⚠️ **Please note that conversations processed from Reddit raw data may include content that is not entirely conducive to a positive experience (e.g., toxic speech). Exercise caution and discretion when utilizing this information.** ## Dataset Structure ### Data Fields - `id2name.json` provides a lookup table (dictionary) from `itemid` (e.g., `tt0053779`) to `itemname` (e.g., `La Dolce Vita (1960)`). Note that the `itemid` is from [IMDB](https://imdb.com), so it can be used to align other movie recommendation datasets sharing the same `itemid`, such as [MovieLens](https://movielens.org/). - `{train, valid, test}.csv` are question-answer pairs that can be used for training, validation and testing (split by dialog creation timestamp in chronological order, from earliest to most recent). There are 12 columns in these `*.csv` files: - `conv_id (string)`: Conversational ID. Since our conversations are collected from Reddit posts, we generate conversations by extracting paths in a Reddit thread with different replies. An example of `conv_id` is: ``` "t3_rt7enj_0/14" # -> t3_rt7enj is the ID of the first post in the thread, 0 means this is the first path extracted from this thread, and 14 means there are 14 paths in total. ``` - `turn_id (string)`: Conversational turn ID. For example: ``` "t3_rt7enj" # -> We can use (conv_id, turn_id) to uniquely define a row in this dataset. ``` - `turn_order (int64)`: The position of the turn in a given conversation, which can be used to sort turns within the conversation. For example: ``` 0 # -> It is the first turn in this conversation. Typically, for conversations from Reddit, the number of turns is not very large. ``` - `user_id (string)`: The unique user id.
For example: ``` "t2_fweij" # -> user id ``` - `is_seeker (bool)`: Whether the speaker at the current turn is the seeker for recommendation or not. For example: ``` true # -> It is the seeker (the seeker starts a movie-requesting conversation on Reddit). ``` - `utc_time (int64)`: The UTC timestamp when this conversation turn happened. For example: ``` 1641234238 # -> Try `datetime.fromtimestamp(1641234238)` ``` - `upvotes (int64)`: The number of upvotes from other Reddit users (it is `null` if this post is the first post in the thread, because upvotes only apply to replies). For example: ``` 6 # -> 6 upvotes from other Reddit users. ``` - `processed (string)`: The role and text at this conversation turn (processed version). For example: ``` "['USER', 'We decided on tt3501632. They love it so far— very funny!']" # -> [ROLE, Processed string] after `eval()`, where we can match `tt3501632` to the real item name using `id2name.json`. ``` - `raw (string)`: The role and text at this conversation turn (raw-text version). For example: ``` "['USER', 'We decided on Thor: Ragnarok. They love it so far— very funny!']" # -> [ROLE, Raw string] after `eval()`, where it is convenient to form it as "USER: We decided on Thor: Ragnarok. They love it so far— very funny!". ``` - `context_processed (string)`: The role and text pairs as the historical conversation context (processed version). For example: ``` "[['USER', 'It’s summer break ... Some of the films we have watched (and they enjoyed) in the past are tt3544112, tt1441952, tt1672078, tt0482571, tt0445590, tt0477348...'], ['SYSTEM', "I'm not big on super hero movies, but even I loved the tt2015381 movies ..."]]" # -> [[ROLE, Processed string], [ROLE, Processed string], ...] after `eval()`, where we can match `tt******` to real item names using `id2name.json`. ``` - `context_raw (string)`: The role and text pairs as the historical conversation context (raw version). For example: ``` "[['USER', 'It’s summer break ... Some of the films we have watched (and they enjoyed) in the past are Sing Street, Salmon Fishing in the Yemen, The Life of Pi, The Prestige, LOTR Trilogy, No Country for Old Men...'], ['SYSTEM', "I'm not big on super hero movies, but even I loved the guardians of the Galaxy movies ..."]]" # -> [[ROLE, Raw string], [ROLE, Raw string], ...] after `eval()`, where we can form "USER: ...\n SYSTEM: ...\n USER:..." easily. ``` - `context_turn_ids (string)`: The conversation context turn_ids associated with the context [ROLE, Processed string] pairs. For example: ``` "['t3_8voapb', 't1_e1p0f5h'] # -> These are the `turn_id`s for the context ['USER', 'It’s summer break ...'], ['SYSTEM', "I'm not big on super hero movie...']. They can be used to retrieve more related information, like `utc_time`, after combining with `conv_id`. ``` ### Data Splits We hold out the last 20% of the data (in chronological order by conversation creation time) as the testing set. The rest can be treated as training samples. We provide a suggested split of Train into Train and Validation, but you are free to try your own splits. | | Total | Train + Validation | Test | | - | - | - | - | | #Conv. | 171,773 | 154,597 | 17,176 | | #Turns | 419,233 | 377,614 | 41,619 | | #Users | 12,508 | 11,477 | 1,384 | | #Items | 31,396 | 30,146 | 10,434 | ### Citation Information Please cite these two papers if you use this dataset, thanks!
```bib @inproceedings{he23large, title = "Large language models as zero-shot conversational recommenders", author = "Zhankui He and Zhouhang Xie and Rahul Jha and Harald Steck and Dawen Liang and Yesu Feng and Bodhisattwa Majumder and Nathan Kallus and Julian McAuley", year = "2023", booktitle = "CIKM" } ``` ```bib @inproceedings{baumgartner2020pushshift, title={The pushshift reddit dataset}, author={Baumgartner, Jason and Zannettou, Savvas and Keegan, Brian and Squire, Megan and Blackburn, Jeremy}, booktitle={Proceedings of the international AAAI conference on web and social media}, volume={14}, pages={830--839}, year={2020} } ``` Please contact [Zhankui He](https://aaronheee.github.io) if you have any questions or suggestions.
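To make the field descriptions above concrete, here is a small loading sketch. The file names (`id2name.json`, `test.csv`) are documented above, but their exact paths inside the repository are an assumption; `ast.literal_eval` is used as a safer stand-in for the `eval()` mentioned in the field descriptions, and the `tt\d+` pattern for IMDB ids is an assumption about the id format: ```python import ast import json import re import pandas as pd from huggingface_hub import hf_hub_download REPO = 'ZhankuiHe/reddit_movie_small_v1' # the card documents these file names; adjust the paths if the repo nests them id2name_path = hf_hub_download(repo_id=REPO, repo_type='dataset', filename='id2name.json') test_path = hf_hub_download(repo_id=REPO, repo_type='dataset', filename='test.csv') with open(id2name_path, 'r', encoding='utf-8') as f: id2name = json.load(f) # e.g. {'tt0053779': 'La Dolce Vita (1960)', ...} test = pd.read_csv(test_path) row = test.iloc[0] # `processed` and `context_processed` are stringified Python lists; # ast.literal_eval parses the same literals eval() would, but safely role, text = ast.literal_eval(row['processed']) context = ast.literal_eval(row['context_processed']) def resolve(s: str) -> str: """Replace IMDB ids such as tt3501632 with their movie titles (assumed tt\d+ format).""" return re.sub(r'tt\d+', lambda m: id2name.get(m.group(0), m.group(0)), s) for ctx_role, ctx_text in context: print(f'{ctx_role}: {resolve(ctx_text)}') print(f'{role}: {resolve(text)}') ``` Rows can then be grouped by `conv_id` and ordered by `turn_order` to rebuild full conversations.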
ZhankuiHe/reddit_movie_small_v1
[ "task_categories:conversational", "language:en", "recommendation", "region:us" ]
2023-08-18T22:10:25+00:00
{"language": ["en"], "task_categories": ["conversational"], "tags": ["recommendation"]}
2023-08-20T16:23:49+00:00
[]
[ "en" ]
TAGS #task_categories-conversational #language-English #recommendation #region-us
Dataset Card for 'Reddit-Movie-small-V1' ======================================== Dataset Description ------------------- * Homepage: URL * Repository: URL * Paper: To appear * Point of Contact: zhh004@URL ### Dataset Summary This dataset contains recommendation-related conversations in the movie domain, intended only for research use in, e.g., conversational recommendation and long-query retrieval tasks. This dataset ranges from Jan. 2022 to Dec. 2022. A larger version (from Jan. 2012 to Dec. 2022) can be found here. ### Dataset Processing We dumped Reddit conversations from URL and converted them into raw text about movie recommendations from five subreddits: * r/movies * r/moviesuggestions * r/bestofnetflix * r/nextflixbestof * r/truefilm After that, we processed them by: 1. extracting movie recommendation conversations; 2. recognizing movie mentions in raw text; 3. linking movie mentions to existing movie entities in the IMDB database. Since the raw text is quite noisy and processing is not perfect, we do observe some failure cases in our processed data. Thus we use V1 to highlight that this processed version is the first version. Contributions of cleaner processed versions (such as V2) are welcome in the future, many thanks! ### Disclaimer Please note that conversations processed from Reddit raw data may include content that is not entirely conducive to a positive experience (e.g., toxic speech). Exercise caution and discretion when utilizing this information. Dataset Structure ----------------- ### Data Fields * 'URL' provides a lookup table (dictionary) from 'itemid' (e.g., 'tt0053779') to 'itemname' (e.g., 'La Dolce Vita (1960)'). Note that the 'itemid' is from IMDB, so it can be used to align other movie recommendation datasets sharing the same 'itemid', such as MovieLens. * '{train, valid, test}.csv' are question-answer pairs that can be used for training, validation and testing (split by dialog creation timestamp in chronological order, from earliest to most recent). There are 12 columns in these '\*.csv' files: + 'conv\_id (string)': Conversational ID. Since our conversations are collected from Reddit posts, we generate conversations by extracting paths in a Reddit thread with different replies. An example of 'conv\_id' is: + 'turn\_id (string)': Conversational turn ID. For example: + 'turn\_order (int64)': The position of the turn in a given conversation, which can be used to sort turns within the conversation. For example: + 'user\_id (string)': The unique user id. For example: + 'is\_seeker (bool)': Whether the speaker at the current turn is the seeker for recommendation or not. For example: + 'utc\_time (int64)': The UTC timestamp when this conversation turn happened. For example: + 'upvotes (int64)': The number of upvotes from other Reddit users (it is 'null' if this post is the first post in the thread, because upvotes only apply to replies). For example: + 'processed (string)': The role and text at this conversation turn (processed version). For example: + 'raw (string)': The role and text at this conversation turn (raw-text version). For example: + 'context\_processed (string)': The role and text pairs as the historical conversation context (processed version). For example: + 'context\_raw (string)': The role and text pairs as the historical conversation context (raw version). For example: + 'context\_turn\_ids (string)': The conversation context turn\_ids associated with the context [ROLE, Processed string] pairs.
For example: ### Data Splits We hold the last 20% data (in chronological order according to the created time of the conversation) as testing set. Others can be treated as training samples. We provided a suggested split to split Train into Train and Validation but you are free to try your splits. Please cite these two papers if you used this dataset, thanks! Please contact Zhankui He if you have any questions or suggestions.
[ "### Dataset Summary\n\n\nThis dataset contains the recommendation-related conversations in movie domain, only for research use in e.g., conversational recommendation, long-query retrieval tasks.\n\n\nThis dataset is ranging from Jan. 2022 to Dec. 2022. Another larger version dataset (from Jan. 2012 to Dec. 2022) can be found here.", "### Dataset Processing\n\n\nWe dump Reddit conversations from URL, converted them into raw text on Reddit about movie recommendations from five subreddits:\n\n\n* r/movies\n* r/moviesuggestions\n* r/bestofnetflix\n* r/nextflixbestof\n* r/truefilm\n\n\nAfter that, we process them by:\n\n\n1. extracting movie recommendation conversations;\n2. recognizing movie mentions in raw text;\n3. linking movie mentions to existing movie entities in IMDB database.\n\n\nSince the raw text is quite noisy and processing is not perfect, we do observe some failure cases in our processed data. Thus we use V1 to highlight that this processed version is the first verion. Welcome to contribute to cleaner processed versions (such as V2) in the future, many thanks!", "### Disclaimer\n\n\n️ Please note that conversations processed from Reddit raw data may include content that is not entirely conducive to a positive experience (e.g., toxic speech). Exercise caution and discretion when utilizing this information.\n\n\nDataset Structure\n-----------------", "### Data Fields\n\n\n* 'URL' provides a lookup table (dictionary) from 'itemid' (e.g., 'tt0053779') to 'itemname' (e.g., 'La Dolce Vita (1960)'). Note that, the 'itemid' is from IMDB, so that it can be used to align other movie recommendation datasets sharing the same 'itemid', such as MovieLens.\n* '{train, valid, test}.csv' are question-answer pairs that can be used for training, validation and testing (split by the dialog created timestamp in their chronological order, ranging from far to recent). There are 12 columns in these '\\*.csv' files:\n\t+ 'conv\\_id (string)': Conversational ID. Since our conversations are collected from reddit posts, we generate conversations by extracting paths in a reddit thread with different replies. An example of 'conv\\_id' is:\n\t+ 'turn\\_id (string)': Conversational turn ID. For example:\n\t+ 'turn\\_order (int64)': No.X turn in a given conversation, which can be used to sort turns within the conversation. For example:\n\t+ 'user\\_id (string)': The unique user id. For example:\n\t+ 'is\\_seeker (bool)': Whether the speaker at the current turn is the seeker for recommendation or not. For example\n\t+ 'utc\\_time (int64)': The UTC timestamp when this conversation turn happend. For example:\n\t+ 'upvotes (int64)': The number of upvotes from other reddit users (it is 'null' if this post is the first post in this thread, because upvotes only work for replies.). For example:\n\t+ 'processed (string)': The role and text at this conversation turn (processed version). For example:\n\t+ 'raw (int64)': The role and text at conversation turn (raw-text version). For example:\n\t+ 'context\\_processed (string)': The role and text pairs as the historical conversation context (processed version). For example:\n\t+ 'context\\_raw (string)': The role and text pairs as the historical conversation context (raw version). For example:\n\t+ 'context\\_turn\\_ids (string)': The conversation context turn\\_ids associated with context [ROLE, Processed string] pairs. For example:", "### Data Splits\n\n\nWe hold the last 20% data (in chronological order according to the created time of the conversation) as testing set. 
Others can be treated as training samples. We provided a suggested split to split Train into Train and Validation but you are free to try your splits.\n\n\n\nPlease cite these two papers if you used this dataset, thanks!\n\n\nPlease contact Zhankui He if you have any questions or suggestions." ]
[ "TAGS\n#task_categories-conversational #language-English #recommendation #region-us \n", "### Dataset Summary\n\n\nThis dataset contains the recommendation-related conversations in movie domain, only for research use in e.g., conversational recommendation, long-query retrieval tasks.\n\n\nThis dataset is ranging from Jan. 2022 to Dec. 2022. Another larger version dataset (from Jan. 2012 to Dec. 2022) can be found here.", "### Dataset Processing\n\n\nWe dump Reddit conversations from URL, converted them into raw text on Reddit about movie recommendations from five subreddits:\n\n\n* r/movies\n* r/moviesuggestions\n* r/bestofnetflix\n* r/nextflixbestof\n* r/truefilm\n\n\nAfter that, we process them by:\n\n\n1. extracting movie recommendation conversations;\n2. recognizing movie mentions in raw text;\n3. linking movie mentions to existing movie entities in IMDB database.\n\n\nSince the raw text is quite noisy and processing is not perfect, we do observe some failure cases in our processed data. Thus we use V1 to highlight that this processed version is the first verion. Welcome to contribute to cleaner processed versions (such as V2) in the future, many thanks!", "### Disclaimer\n\n\n️ Please note that conversations processed from Reddit raw data may include content that is not entirely conducive to a positive experience (e.g., toxic speech). Exercise caution and discretion when utilizing this information.\n\n\nDataset Structure\n-----------------", "### Data Fields\n\n\n* 'URL' provides a lookup table (dictionary) from 'itemid' (e.g., 'tt0053779') to 'itemname' (e.g., 'La Dolce Vita (1960)'). Note that, the 'itemid' is from IMDB, so that it can be used to align other movie recommendation datasets sharing the same 'itemid', such as MovieLens.\n* '{train, valid, test}.csv' are question-answer pairs that can be used for training, validation and testing (split by the dialog created timestamp in their chronological order, ranging from far to recent). There are 12 columns in these '\\*.csv' files:\n\t+ 'conv\\_id (string)': Conversational ID. Since our conversations are collected from reddit posts, we generate conversations by extracting paths in a reddit thread with different replies. An example of 'conv\\_id' is:\n\t+ 'turn\\_id (string)': Conversational turn ID. For example:\n\t+ 'turn\\_order (int64)': No.X turn in a given conversation, which can be used to sort turns within the conversation. For example:\n\t+ 'user\\_id (string)': The unique user id. For example:\n\t+ 'is\\_seeker (bool)': Whether the speaker at the current turn is the seeker for recommendation or not. For example\n\t+ 'utc\\_time (int64)': The UTC timestamp when this conversation turn happend. For example:\n\t+ 'upvotes (int64)': The number of upvotes from other reddit users (it is 'null' if this post is the first post in this thread, because upvotes only work for replies.). For example:\n\t+ 'processed (string)': The role and text at this conversation turn (processed version). For example:\n\t+ 'raw (int64)': The role and text at conversation turn (raw-text version). For example:\n\t+ 'context\\_processed (string)': The role and text pairs as the historical conversation context (processed version). For example:\n\t+ 'context\\_raw (string)': The role and text pairs as the historical conversation context (raw version). For example:\n\t+ 'context\\_turn\\_ids (string)': The conversation context turn\\_ids associated with context [ROLE, Processed string] pairs. 
For example:", "### Data Splits\n\n\nWe hold the last 20% data (in chronological order according to the created time of the conversation) as testing set. Others can be treated as training samples. We provided a suggested split to split Train into Train and Validation but you are free to try your splits.\n\n\n\nPlease cite these two papers if you used this dataset, thanks!\n\n\nPlease contact Zhankui He if you have any questions or suggestions." ]
[ 25, 82, 179, 59, 567, 94 ]
[ "passage: TAGS\n#task_categories-conversational #language-English #recommendation #region-us \n### Dataset Summary\n\n\nThis dataset contains the recommendation-related conversations in movie domain, only for research use in e.g., conversational recommendation, long-query retrieval tasks.\n\n\nThis dataset is ranging from Jan. 2022 to Dec. 2022. Another larger version dataset (from Jan. 2012 to Dec. 2022) can be found here.### Dataset Processing\n\n\nWe dump Reddit conversations from URL, converted them into raw text on Reddit about movie recommendations from five subreddits:\n\n\n* r/movies\n* r/moviesuggestions\n* r/bestofnetflix\n* r/nextflixbestof\n* r/truefilm\n\n\nAfter that, we process them by:\n\n\n1. extracting movie recommendation conversations;\n2. recognizing movie mentions in raw text;\n3. linking movie mentions to existing movie entities in IMDB database.\n\n\nSince the raw text is quite noisy and processing is not perfect, we do observe some failure cases in our processed data. Thus we use V1 to highlight that this processed version is the first verion. Welcome to contribute to cleaner processed versions (such as V2) in the future, many thanks!### Disclaimer\n\n\n️ Please note that conversations processed from Reddit raw data may include content that is not entirely conducive to a positive experience (e.g., toxic speech). Exercise caution and discretion when utilizing this information.\n\n\nDataset Structure\n-----------------" ]
5d084dfe2e79515ec092e4a219f052c135faca97
# Dataset Card for `Reddit-Movie-large-V1`

## Dataset Description

- **Homepage:** https://github.com/AaronHeee/LLMs-as-Zero-Shot-Conversational-RecSys
- **Repository:** https://github.com/AaronHeee/LLMs-as-Zero-Shot-Conversational-RecSys
- **Paper:** To appear
- **Point of Contact:** [email protected]

### Dataset Summary

This dataset contains recommendation-related conversations in the movie domain, intended only for research use, e.g., in conversational recommendation or long-query retrieval tasks.

The dataset ranges from Jan. 2012 to Dec. 2022. A smaller version (from Jan. 2022 to Dec. 2022) can be found [here](https://huggingface.co/datasets/ZhankuiHe/reddit_movie_small_v1).

### Dataset Processing

We dumped [Reddit](https://reddit.com) conversations from [pushshift.io](https://pushshift.io) and converted them into [raw text](https://huggingface.co/datasets/ZhankuiHe/reddit_movie_raw) about movie recommendations from five subreddits:

- [r/movies](https://www.reddit.com/r/movies/)
- [r/moviesuggestions](https://www.reddit.com/r/suggestions/)
- [r/bestofnetflix](https://www.reddit.com/r/bestofnetflix/)
- [r/nextflixbestof](https://www.reddit.com/r/netflixbestof/)
- [r/truefilm](https://www.reddit.com/r/truefilm/)

After that, we processed the text by:

1. extracting movie recommendation conversations;
2. recognizing movie mentions in the raw text;
3. linking movie mentions to existing movie entities in the [IMDB](https://imdb.com) database.

Since the raw text is quite noisy and the processing is not perfect, we do observe some failure cases in the processed data. We therefore use V1 to highlight that this is the first processed version. You are welcome to contribute cleaner processed versions (such as V2) in the future; many thanks!

### Disclaimer

⚠️ **Please note that conversations processed from Reddit raw data may include content that is not entirely conducive to a positive experience (e.g., toxic speech). Exercise caution and discretion when utilizing this information.**

## Dataset Structure

### Data Fields

- `id2name.json` provides a lookup table (dictionary) from `itemid` (e.g., `tt0053779`) to `itemname` (e.g., `La Dolce Vita (1960)`). Note that the `itemid` comes from [IMDB](https://imdb.com), so it can be used to align this dataset with other movie recommendation datasets sharing the same `itemid`, such as [MovieLens](https://movielens.org/).
- `{train, valid, test}.csv` hold question-answer pairs that can be used for training, validation and testing (split by the dialog creation timestamp in chronological order, from oldest to most recent). There are 12 columns in these `*.csv` files (a minimal loading sketch follows the Data Splits table below):
  - `conv_id (string)`: Conversation ID. Since our conversations are collected from Reddit posts, we generate conversations by extracting paths in a Reddit thread with different replies. An example of `conv_id` is:
    ```
    "t3_rt7enj_0/14" # -> t3_rt7enj is the ID of the first post in the thread, 0 means this is the first path extracted from this thread, and 14 means there are 14 paths in total.
    ```
  - `turn_id (string)`: Conversation turn ID. For example:
    ```
    "t3_rt7enj" # -> We can use (conv_id, turn_id) to uniquely identify a row in this dataset.
    ```
  - `turn_order (int64)`: The position of the turn within a given conversation, which can be used to sort turns. For example:
    ```
    0 # -> This is the first turn in the conversation. Conversations from Reddit typically do not have many turns.
    ```
  - `user_id (string)`: The unique user id.
    For example:
    ```
    "t2_fweij" # -> user id
    ```
  - `is_seeker (bool)`: Whether the speaker at the current turn is the seeker for recommendation or not. For example:
    ```
    true # -> This speaker is the seeker (the seeker starts a movie-requesting conversation on Reddit).
    ```
  - `utc_time (int64)`: The UTC timestamp when this conversation turn happened. For example:
    ```
    1641234238 # -> Try `datetime.fromtimestamp(1641234238)`
    ```
  - `upvotes (int64)`: The number of upvotes from other Reddit users (it is `null` if this post is the first post in the thread, because upvotes only apply to replies). For example:
    ```
    6 # -> 6 upvotes from other Reddit users.
    ```
  - `processed (string)`: The role and text at this conversation turn (processed version). For example:
    ```
    "['USER', 'We decided on tt3501632. They love it so far— very funny!']" # -> [ROLE, Processed string] after `eval()`, where we can match `tt3501632` to the real item name using `id2name.json`.
    ```
  - `raw (string)`: The role and text at this conversation turn (raw-text version). For example:
    ```
    "['USER', 'We decided on Thor: Ragnarok. They love it so far— very funny!']" # -> [ROLE, Raw string] after `eval()`, where it is convenient to format it as "USER: We decided on Thor: Ragnarok. They love it so far— very funny!".
    ```
  - `context_processed (string)`: The role and text pairs forming the historical conversation context (processed version). For example:
    ```
    "[['USER', 'It’s summer break ... Some of the films we have watched (and they enjoyed) in the past are tt3544112, tt1441952, tt1672078, tt0482571, tt0445590, tt0477348...'], ['SYSTEM', "I'm not big on super hero movies, but even I loved the tt2015381 movies ..."]]" # -> [[ROLE, Processed string], [ROLE, Processed string], ...] after `eval()`, where we can match each `tt******` to a real item name using `id2name.json`.
    ```
  - `context_raw (string)`: The role and text pairs forming the historical conversation context (raw version). For example:
    ```
    "[['USER', 'It’s summer break ... Some of the films we have watched (and they enjoyed) in the past are Sing Street, Salmon Fishing in the Yemen, The Life of Pi, The Prestige, LOTR Trilogy, No Country for Old Men...'], ['SYSTEM', "I'm not big on super hero movies, but even I loved the guardians of the Galaxy movies ..."]]" # -> [[ROLE, Raw string], [ROLE, Raw string], ...] after `eval()`, where we can easily format "USER: ...\n SYSTEM: ...\n USER: ...".
    ```
  - `context_turn_ids (string)`: The conversation context turn_ids associated with the context [ROLE, Processed string] pairs. For example:
    ```
    "['t3_8voapb', 't1_e1p0f5h']" # -> These are the `turn_id`s for the context ['USER', 'It’s summer break ...'], ['SYSTEM', "I'm not big on super hero movie...']. Combined with `conv_id`, they can be used to retrieve more related information, such as `utc_time`.
    ```

### Data Splits

We hold out the last 20% of the data (in chronological order according to the conversation creation time) as the testing set; the rest can be used for training. We provide a suggested split of Train into Train and Validation, but you are free to use your own splits.

| Statistic | Total | Train + Validation | Test |
|---|---|---|---|
| #Conv. | 634,392 | 570,955 | 63,437 |
| #Turns | 1,669,720 | 1,514,537 | 155,183 |
| #Users | 36,247 | 32,676 | 4,559 |
| #Items | 51,203 | 48,838 | 20,275 |
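As a minimal loading sketch (an illustration, not code from the official repository), the snippet below assumes `id2name.json` and `test.csv` have been downloaded to the working directory. It parses one `processed` turn with `ast.literal_eval`, a safer stand-in for the `eval()` mentioned above, and maps item ids back to readable names.

```python
import ast
import json

import pandas as pd

# Lookup table from IMDB item id (e.g., "tt0053779") to item name.
with open("id2name.json") as f:
    id2name = json.load(f)

test = pd.read_csv("test.csv")

# `processed` stores a stringified [ROLE, text] pair; parse it back into a list.
role, text = ast.literal_eval(test.loc[0, "processed"])

# Naively substitute each known item id with its readable name.
# (A real pipeline would tokenize first to avoid partial-id collisions.)
for item_id, item_name in id2name.items():
    if item_id in text:
        text = text.replace(item_id, item_name)

print(f"{role}: {text}")
```

The same pattern applies to `context_processed`, which parses into a list of [ROLE, text] pairs.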
### Citation Information

Please cite these two papers if you used this dataset, thanks!

```bib
@inproceedings{he23large,
  title = "Large language models as zero-shot conversational recommenders",
  author = "Zhankui He and Zhouhang Xie and Rahul Jha and Harald Steck and Dawen Liang and Yesu Feng and Bodhisattwa Majumder and Nathan Kallus and Julian McAuley",
  year = "2023",
  booktitle = "CIKM"
}
```

```bib
@inproceedings{baumgartner2020pushshift,
  title={The pushshift reddit dataset},
  author={Baumgartner, Jason and Zannettou, Savvas and Keegan, Brian and Squire, Megan and Blackburn, Jeremy},
  booktitle={Proceedings of the International AAAI Conference on Web and Social Media},
  volume={14},
  pages={830--839},
  year={2020}
}
```

Please contact [Zhankui He](https://aaronheee.github.io) if you have any questions or suggestions.
ZhankuiHe/reddit_movie_large_v1
[ "task_categories:conversational", "language:en", "recommendation", "region:us" ]
2023-08-18T22:10:40+00:00
{"language": ["en"], "task_categories": ["conversational"], "tags": ["recommendation"]}
2023-08-20T16:24:14+00:00
[]
[ "en" ]
c575598682d370e3544d8002b3d93b088b0f3f9c
# Dataset of yamashiro_takane (Touhou)

This is the dataset of yamashiro_takane (Touhou), containing 254 images and their tags.

The core tags of this character are `green_hair, hat, green_eyes, flat_cap, medium_hair, bangs, green_headwear, camouflage_headwear`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 254 | 299.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 254 | 186.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 578 | 381.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 254 | 273.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 578 | 514.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/yamashiro_takane_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, boots, green_shirt, key, simple_background, white_background, brown_footwear, frills, full_body, green_skirt, camouflage_jacket, long_sleeves, smile, holding_card, pocket, standing, looking_at_viewer, open_mouth, backpack, blue_headwear, box | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | boots | green_shirt | key | simple_background | white_background | brown_footwear | frills | full_body | green_skirt | camouflage_jacket | long_sleeves | smile | holding_card | pocket | standing | looking_at_viewer | open_mouth | backpack | blue_headwear | box | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------|:------|:--------------------|:-------------------|:-----------------|:---------|:------------|:--------------|:--------------------|:---------------|:--------|:---------------|:---------|:-----------|:--------------------|:-------------|:-----------|:----------------|:------| | 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
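As a follow-up sketch (an illustration, not part of the release), the cluster tags above can be used to filter the extracted raw dataset: the snippet below keeps images whose tags contain cluster #0's core outfit tags. The `dataset_dir` path and the chosen tag subset are assumptions carried over from the loading snippet.

```python
from waifuc.source import LocalSource

# Assumes the raw archive was already extracted to `dataset_dir`,
# as in the loading snippet above.
source = LocalSource('dataset_dir')

# Core outfit tags taken from cluster #0 above; adjust to mine other outfits.
wanted = {'green_shirt', 'green_skirt', 'camouflage_jacket'}

for item in source:
    # `item.meta['tags']` may be a dict keyed by tag name; building a set
    # over it yields the tag names in either case.
    tags = set(item.meta['tags'])
    if wanted <= tags:
        print(item.meta['filename'], 'matches the cluster #0 outfit tags')
```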
CyberHarem/yamashiro_takane_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T22:12:58+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-15T08:12:19+00:00
[]
[]