Column schema (string/list lengths as reported by the dataset viewer):

| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | list | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | list | 0 | 25 |
| languages | list | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | list | 0 | 352 |
| processed_texts | list | 1 | 353 |
| tokens_length | list | 1 | 353 |
| input_texts | list | 1 | 40 |
a4e1b28c900bade9fb681e330ad719e8da6d5143
# Dataset Card for Evaluation run of uni-tianyan/Uni-TianYan

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/uni-tianyan/Uni-TianYan
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [uni-tianyan/Uni-TianYan](https://huggingface.co/uni-tianyan/Uni-TianYan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_uni-tianyan__Uni-TianYan",
                    "harness_winogrande_5",
                    split="train")
```

## Latest results

These are the [latest results from run 2023-09-18T02:40:09.826211](https://huggingface.co/datasets/open-llm-leaderboard/details_uni-tianyan__Uni-TianYan/blob/main/results_2023-09-18T02-40-09.826211.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.24486157718120805,
        "em_stderr": 0.004403654691385411,
        "f1": 0.39787751677852523,
        "f1_stderr": 0.004155160727794137,
        "acc": 0.5222921265482389,
        "acc_stderr": 0.01107896164608613
    },
    "harness|drop|3": {
        "em": 0.24486157718120805,
        "em_stderr": 0.004403654691385411,
        "f1": 0.39787751677852523,
        "f1_stderr": 0.004155160727794137
    },
    "harness|gsm8k|5": {
        "acc": 0.221379833206975,
        "acc_stderr": 0.011436000004253518
    },
    "harness|winogrande|5": {
        "acc": 0.8232044198895028,
        "acc_stderr": 0.010721923287918744
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
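Following on from the loading snippet above, a minimal sketch of pulling the aggregated numbers (the "results" config and its "latest" split appear in this repo's configuration listing below; treating the "all" accuracy as the plain mean of the per-task accuracies is an assumption, though the figures above are consistent with it):

```python
from datasets import load_dataset

# Aggregated metrics for this model; "latest" points at the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_uni-tianyan__Uni-TianYan",
    "results",
    split="latest",
)

# Sanity check against the JSON above: the "all" accuracy equals the mean
# of the gsm8k and winogrande accuracies (assumption: plain-mean aggregation).
print((0.221379833206975 + 0.8232044198895028) / 2)  # 0.5222921265482389
```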
open-llm-leaderboard/details_uni-tianyan__Uni-TianYan
[ "region:us" ]
2023-09-03T11:28:00+00:00
{"pretty_name": "Evaluation run of uni-tianyan/Uni-TianYan", "dataset_summary": "Dataset automatically created during the evaluation run of model [uni-tianyan/Uni-TianYan](https://huggingface.co/uni-tianyan/Uni-TianYan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uni-tianyan__Uni-TianYan\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T02:40:09.826211](https://huggingface.co/datasets/open-llm-leaderboard/details_uni-tianyan__Uni-TianYan/blob/main/results_2023-09-18T02-40-09.826211.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24486157718120805,\n \"em_stderr\": 0.004403654691385411,\n \"f1\": 0.39787751677852523,\n \"f1_stderr\": 0.004155160727794137,\n \"acc\": 0.5222921265482389,\n \"acc_stderr\": 0.01107896164608613\n },\n \"harness|drop|3\": {\n \"em\": 0.24486157718120805,\n \"em_stderr\": 0.004403654691385411,\n \"f1\": 0.39787751677852523,\n \"f1_stderr\": 0.004155160727794137\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.221379833206975,\n \"acc_stderr\": 0.011436000004253518\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918744\n }\n}\n```", "repo_url": "https://huggingface.co/uni-tianyan/Uni-TianYan", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|arc:challenge|25_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T02_40_09.826211", "path": ["**/details_harness|drop|3_2023-09-18T02-40-09.826211.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T02-40-09.826211.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T02_40_09.826211", "path": ["**/details_harness|gsm8k|5_2023-09-18T02-40-09.826211.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T02-40-09.826211.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hellaswag|10_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T12:27:36.436118.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T12:27:36.436118.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T12:27:36.436118.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T12:27:36.436118.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T12:27:36.436118.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T02_40_09.826211", "path": ["**/details_harness|winogrande|5_2023-09-18T02-40-09.826211.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T02-40-09.826211.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T12_27_36.436118", "path": ["results_2023-09-03T12:27:36.436118.parquet"]}, {"split": "2023_09_18T02_40_09.826211", "path": ["results_2023-09-18T02-40-09.826211.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T02-40-09.826211.parquet"]}]}]}
2023-09-18T01:40:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of uni-tianyan/Uni-TianYan ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model uni-tianyan/Uni-TianYan on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-18T02:40:09.826211(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of uni-tianyan/Uni-TianYan", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uni-tianyan/Uni-TianYan on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T02:40:09.826211(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of uni-tianyan/Uni-TianYan", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uni-tianyan/Uni-TianYan on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T02:40:09.826211(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uni-tianyan/Uni-TianYan## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model uni-tianyan/Uni-TianYan on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T02:40:09.826211(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1ca974624ee2bb45828148caebc46446195d0f8e
# Dataset Card for "LILTS" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
LauraExp/LILTS
[ "region:us" ]
2023-09-03T11:30:16+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 3396339.0, "num_examples": 2}, {"name": "test", "num_bytes": 3396339.0, "num_examples": 2}], "download_size": 0, "dataset_size": 6792678.0}}
2023-09-03T11:50:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "LILTS" More Information needed
[ "# Dataset Card for \"LILTS\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"LILTS\"\n\nMore Information needed" ]
[ 6, 13 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"LILTS\"\n\nMore Information needed" ]
be2fc79cce28ab2273e4c1427d44547cbeb7d7d3
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qkvo_rank14_v3](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-03T13:17:02.987872](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3/blob/main/results_2023-09-03T13%3A17%3A02.987872.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5091352266849982, "acc_stderr": 0.03495474191892426, "acc_norm": 0.5128128131582483, "acc_norm_stderr": 0.03493935725866389, "mc1": 0.3671970624235006, "mc1_stderr": 0.01687480500145318, "mc2": 0.5344202990692574, "mc2_stderr": 0.015729161957393895 }, "harness|arc:challenge|25": { "acc": 0.5298634812286689, "acc_stderr": 0.014585305840007105, "acc_norm": 0.5597269624573379, "acc_norm_stderr": 0.01450676952480424 }, "harness|hellaswag|10": { "acc": 0.6050587532364071, "acc_stderr": 0.004878390226591715, "acc_norm": 0.7921728739294961, "acc_norm_stderr": 0.00404923158643323 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5, "acc_stderr": 0.04068942293855797, "acc_norm": 0.5, "acc_norm_stderr": 0.04068942293855797 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5584905660377358, "acc_stderr": 0.030561590426731833, "acc_norm": 0.5584905660377358, "acc_norm_stderr": 0.030561590426731833 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5486111111111112, "acc_stderr": 0.041614023984032786, "acc_norm": 0.5486111111111112, "acc_norm_stderr": 0.041614023984032786 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, 
"acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.45664739884393063, "acc_stderr": 0.03798106566014498, "acc_norm": 0.45664739884393063, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.04389869956808778, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.04389869956808778 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46808510638297873, "acc_stderr": 0.03261936918467382, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.03261936918467382 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.34210526315789475, "acc_stderr": 0.04462917535336936, "acc_norm": 0.34210526315789475, "acc_norm_stderr": 0.04462917535336936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.46206896551724136, "acc_stderr": 0.041546596717075474, "acc_norm": 0.46206896551724136, "acc_norm_stderr": 0.041546596717075474 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.29365079365079366, "acc_stderr": 0.023456037383982022, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.023456037383982022 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2698412698412698, "acc_stderr": 0.03970158273235173, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.03970158273235173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5451612903225806, "acc_stderr": 0.028327743091561077, "acc_norm": 0.5451612903225806, "acc_norm_stderr": 0.028327743091561077 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3793103448275862, "acc_stderr": 0.034139638059062345, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.034139638059062345 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885415, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885415 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6363636363636364, "acc_stderr": 0.03427308652999934, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.03427308652999934 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7253886010362695, "acc_stderr": 0.03221024508041153, "acc_norm": 0.7253886010362695, "acc_norm_stderr": 0.03221024508041153 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4948717948717949, "acc_stderr": 0.02534967290683866, "acc_norm": 0.4948717948717949, "acc_norm_stderr": 0.02534967290683866 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.026067159222275805, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.026067159222275805 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5126050420168067, 
"acc_stderr": 0.03246816765752174, "acc_norm": 0.5126050420168067, "acc_norm_stderr": 0.03246816765752174 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526733, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526733 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7211009174311926, "acc_stderr": 0.0192274688764635, "acc_norm": 0.7211009174311926, "acc_norm_stderr": 0.0192274688764635 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4166666666666667, "acc_stderr": 0.03362277436608043, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.03362277436608043 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.696078431372549, "acc_stderr": 0.03228210387037892, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.03228210387037892 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7341772151898734, "acc_stderr": 0.02875679962965834, "acc_norm": 0.7341772151898734, "acc_norm_stderr": 0.02875679962965834 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5515695067264574, "acc_stderr": 0.03337883736255098, "acc_norm": 0.5515695067264574, "acc_norm_stderr": 0.03337883736255098 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6030534351145038, "acc_stderr": 0.04291135671009224, "acc_norm": 0.6030534351145038, "acc_norm_stderr": 0.04291135671009224 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7024793388429752, "acc_stderr": 0.04173349148083499, "acc_norm": 0.7024793388429752, "acc_norm_stderr": 0.04173349148083499 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6388888888888888, "acc_stderr": 0.04643454608906275, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.04643454608906275 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5398773006134969, "acc_stderr": 0.03915857291436971, "acc_norm": 0.5398773006134969, "acc_norm_stderr": 0.03915857291436971 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.04493949068613539, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.04493949068613539 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7649572649572649, "acc_stderr": 0.027778835904935434, "acc_norm": 0.7649572649572649, "acc_norm_stderr": 0.027778835904935434 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7037037037037037, "acc_stderr": 0.016328814422102052, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.016328814422102052 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5664739884393064, "acc_stderr": 0.026680134761679214, "acc_norm": 0.5664739884393064, "acc_norm_stderr": 0.026680134761679214 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2547486033519553, "acc_stderr": 0.014572650383409155, "acc_norm": 0.2547486033519553, "acc_norm_stderr": 0.014572650383409155 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5359477124183006, "acc_stderr": 0.02855582751652878, "acc_norm": 0.5359477124183006, "acc_norm_stderr": 0.02855582751652878 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5819935691318328, "acc_stderr": 0.028013651891995072, "acc_norm": 0.5819935691318328, "acc_norm_stderr": 0.028013651891995072 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.5524691358024691, "acc_stderr": 0.027667138569422704, "acc_norm": 0.5524691358024691, "acc_norm_stderr": 0.027667138569422704 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.36879432624113473, "acc_stderr": 0.028782227561347243, "acc_norm": 0.36879432624113473, "acc_norm_stderr": 0.028782227561347243 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3820078226857888, "acc_stderr": 0.012409564470235567, "acc_norm": 0.3820078226857888, "acc_norm_stderr": 0.012409564470235567 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.48161764705882354, "acc_stderr": 0.030352303395351964, "acc_norm": 0.48161764705882354, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4820261437908497, "acc_stderr": 0.020214761037872404, "acc_norm": 0.4820261437908497, "acc_norm_stderr": 0.020214761037872404 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5818181818181818, "acc_stderr": 0.04724577405731572, "acc_norm": 0.5818181818181818, "acc_norm_stderr": 0.04724577405731572 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6040816326530613, "acc_stderr": 0.03130802899065686, "acc_norm": 0.6040816326530613, "acc_norm_stderr": 0.03130802899065686 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5373134328358209, "acc_stderr": 0.035256751674679745, "acc_norm": 0.5373134328358209, "acc_norm_stderr": 0.035256751674679745 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6783625730994152, "acc_stderr": 0.03582529442573122, "acc_norm": 0.6783625730994152, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.3671970624235006, "mc1_stderr": 0.01687480500145318, "mc2": 0.5344202990692574, "mc2_stderr": 0.015729161957393895 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3
[ "region:us" ]
2023-09-03T12:17:27+00:00
{"pretty_name": "Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qkvo_rank14_v3](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-03T13:17:02.987872](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3/blob/main/results_2023-09-03T13%3A17%3A02.987872.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5091352266849982,\n \"acc_stderr\": 0.03495474191892426,\n \"acc_norm\": 0.5128128131582483,\n \"acc_norm_stderr\": 0.03493935725866389,\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5344202990692574,\n \"mc2_stderr\": 0.015729161957393895\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n \"acc_norm\": 0.5597269624573379,\n \"acc_norm_stderr\": 0.01450676952480424\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6050587532364071,\n \"acc_stderr\": 0.004878390226591715,\n \"acc_norm\": 0.7921728739294961,\n \"acc_norm_stderr\": 0.00404923158643323\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n 
\"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982022,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982022\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5451612903225806,\n \"acc_stderr\": 0.028327743091561077,\n \"acc_norm\": 0.5451612903225806,\n \"acc_norm_stderr\": 0.028327743091561077\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041153\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4948717948717949,\n \"acc_stderr\": 0.02534967290683866,\n \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.02534967290683866\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7211009174311926,\n \"acc_stderr\": 0.0192274688764635,\n \"acc_norm\": 0.7211009174311926,\n \"acc_norm_stderr\": 0.0192274688764635\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.016328814422102052,\n \"acc_norm\": 
0.7037037037037037,\n \"acc_norm_stderr\": 0.016328814422102052\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679214,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679214\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n \"acc_stderr\": 0.014572650383409155,\n \"acc_norm\": 0.2547486033519553,\n \"acc_norm_stderr\": 0.014572650383409155\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422704,\n \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422704\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3820078226857888,\n \"acc_stderr\": 0.012409564470235567,\n \"acc_norm\": 0.3820078226857888,\n \"acc_norm_stderr\": 0.012409564470235567\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.030352303395351964,\n \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.030352303395351964\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872404,\n \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872404\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5373134328358209,\n \"acc_stderr\": 0.035256751674679745,\n \"acc_norm\": 0.5373134328358209,\n \"acc_norm_stderr\": 0.035256751674679745\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5344202990692574,\n \"mc2_stderr\": 0.015729161957393895\n }\n}\n```", "repo_url": "https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", 
"data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|arc:challenge|25_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hellaswag|10_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T13:17:02.987872.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T13_17_02.987872", "path": ["results_2023-09-03T13:17:02.987872.parquet"]}, {"split": "latest", "path": ["results_2023-09-03T13:17:02.987872.parquet"]}]}]}
2023-09-03T12:18:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qkvo_rank14_v3 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-03T13:17:02.987872 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
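(The flattened card text above drops fenced code blocks; the load snippet it refers to, preserved verbatim in this record's metadata, is:)

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3",
	"harness_truthfulqa_mc_0",
	split="train")
```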
[ "# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qkvo_rank14_v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-03T13:17:02.987872(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qkvo_rank14_v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-03T13:17:02.987872(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qkvo_rank14_v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-03T13:17:02.987872(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e60c86870a332373705827d85f185a7979902572
# Dataset Card for Evaluation run of xxyyy123/20k_v1_lora_qkvo_rank14_v2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/xxyyy123/20k_v1_lora_qkvo_rank14_v2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [xxyyy123/20k_v1_lora_qkvo_rank14_v2](https://huggingface.co/xxyyy123/20k_v1_lora_qkvo_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__20k_v1_lora_qkvo_rank14_v2",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-03T13:20:05.284068](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__20k_v1_lora_qkvo_rank14_v2/blob/main/results_2023-09-03T13%3A20%3A05.284068.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.5085078094647929, "acc_stderr": 0.03515476481930117, "acc_norm": 0.5121639524782741, "acc_norm_stderr": 0.035139678425659286, "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.5157743333677478, "mc2_stderr": 0.01586124547215222 }, "harness|arc:challenge|25": { "acc": 0.5264505119453925, "acc_stderr": 0.01459093135812017, "acc_norm": 0.5537542662116041, "acc_norm_stderr": 0.014526705548539982 }, "harness|hellaswag|10": { "acc": 0.6025692093208525, "acc_stderr": 0.004883663587184775, "acc_norm": 0.7909778928500298, "acc_norm_stderr": 0.004057792171893577 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4605263157894737, "acc_stderr": 0.04056242252249033, "acc_norm": 0.4605263157894737, "acc_norm_stderr": 0.04056242252249033 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5735849056603773, "acc_stderr": 0.030437794342983056, "acc_norm": 0.5735849056603773, "acc_norm_stderr": 0.030437794342983056 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.041553199555931467, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.041553199555931467 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 
## Latest results

These are the [latest results from run 2023-09-03T13:20:05.284068](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__20k_v1_lora_qkvo_rank14_v2/blob/main/results_2023-09-03T13%3A20%3A05.284068.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.5085078094647929, "acc_stderr": 0.03515476481930117, "acc_norm": 0.5121639524782741, "acc_norm_stderr": 0.035139678425659286, "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.5157743333677478, "mc2_stderr": 0.01586124547215222 },
    "harness|arc:challenge|25": { "acc": 0.5264505119453925, "acc_stderr": 0.01459093135812017, "acc_norm": 0.5537542662116041, "acc_norm_stderr": 0.014526705548539982 },
    "harness|hellaswag|10": { "acc": 0.6025692093208525, "acc_stderr": 0.004883663587184775, "acc_norm": 0.7909778928500298, "acc_norm_stderr": 0.004057792171893577 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.4605263157894737, "acc_stderr": 0.04056242252249033, "acc_norm": 0.4605263157894737, "acc_norm_stderr": 0.04056242252249033 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5735849056603773, "acc_stderr": 0.030437794342983056, "acc_norm": 0.5735849056603773, "acc_norm_stderr": 0.030437794342983056 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.041553199555931467, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.041553199555931467 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.44508670520231214, "acc_stderr": 0.03789401760283647, "acc_norm": 0.44508670520231214, "acc_norm_stderr": 0.03789401760283647 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.04389869956808778, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.04389869956808778 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46382978723404256, "acc_stderr": 0.032600385118357715, "acc_norm": 0.46382978723404256, "acc_norm_stderr": 0.032600385118357715 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.3508771929824561, "acc_stderr": 0.044895393502707, "acc_norm": 0.3508771929824561, "acc_norm_stderr": 0.044895393502707 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.46206896551724136, "acc_stderr": 0.041546596717075474, "acc_norm": 0.46206896551724136, "acc_norm_stderr": 0.041546596717075474 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.28835978835978837, "acc_stderr": 0.023330654054535886, "acc_norm": 0.28835978835978837, "acc_norm_stderr": 0.023330654054535886 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.30952380952380953, "acc_stderr": 0.04134913018303316, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.04134913018303316 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5161290322580645, "acc_stderr": 0.028429203176724555, "acc_norm": 0.5161290322580645, "acc_norm_stderr": 0.028429203176724555 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.37438423645320196, "acc_stderr": 0.03405155380561952, "acc_norm": 0.37438423645320196, "acc_norm_stderr": 0.03405155380561952 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7212121212121212, "acc_stderr": 0.035014387062967806, "acc_norm": 0.7212121212121212, "acc_norm_stderr": 0.035014387062967806 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6313131313131313, "acc_stderr": 0.034373055019806184, "acc_norm": 0.6313131313131313, "acc_norm_stderr": 0.034373055019806184 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7202072538860104, "acc_stderr": 0.032396370467357036, "acc_norm": 0.7202072538860104, "acc_norm_stderr": 0.032396370467357036 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4641025641025641, "acc_stderr": 0.025285585990017845, "acc_norm": 0.4641025641025641, "acc_norm_stderr": 0.025285585990017845 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.026593939101844086, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.026593939101844086 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.49159663865546216, "acc_stderr": 0.03247390276569669, "acc_norm": 0.49159663865546216, "acc_norm_stderr": 0.03247390276569669 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242741, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242741 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.708256880733945, "acc_stderr": 0.01948930096887652, "acc_norm": 0.708256880733945, "acc_norm_stderr": 0.01948930096887652 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.38425925925925924, "acc_stderr": 0.03317354514310742, "acc_norm": 0.38425925925925924, "acc_norm_stderr": 0.03317354514310742 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7058823529411765, "acc_stderr": 0.03198001660115071, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.03198001660115071 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7130801687763713, "acc_stderr": 0.02944377302259469, "acc_norm": 0.7130801687763713, "acc_norm_stderr": 0.02944377302259469 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.5695067264573991, "acc_stderr": 0.033231973029429394, "acc_norm": 0.5695067264573991, "acc_norm_stderr": 0.033231973029429394 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5954198473282443, "acc_stderr": 0.043046937953806645, "acc_norm": 0.5954198473282443, "acc_norm_stderr": 0.043046937953806645 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.6528925619834711, "acc_stderr": 0.043457245702925335, "acc_norm": 0.6528925619834711, "acc_norm_stderr": 0.043457245702925335 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5766871165644172, "acc_stderr": 0.03881891213334384, "acc_norm": 0.5766871165644172, "acc_norm_stderr": 0.03881891213334384 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 },
    "harness|hendrycksTest-management|5": { "acc": 0.6504854368932039, "acc_stderr": 0.047211885060971716, "acc_norm": 0.6504854368932039, "acc_norm_stderr": 0.047211885060971716 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.7393162393162394, "acc_stderr": 0.02876034895652341, "acc_norm": 0.7393162393162394, "acc_norm_stderr": 0.02876034895652341 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7088122605363985, "acc_stderr": 0.01624608706970141, "acc_norm": 0.7088122605363985, "acc_norm_stderr": 0.01624608706970141 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5635838150289018, "acc_stderr": 0.02670054542494368, "acc_norm": 0.5635838150289018, "acc_norm_stderr": 0.02670054542494368 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.26927374301675977, "acc_stderr": 0.014835616582882618, "acc_norm": 0.26927374301675977, "acc_norm_stderr": 0.014835616582882618 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.5228758169934641, "acc_stderr": 0.028599936776089782, "acc_norm": 0.5228758169934641, "acc_norm_stderr": 0.028599936776089782 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.5852090032154341, "acc_stderr": 0.027982680459759563, "acc_norm": 0.5852090032154341, "acc_norm_stderr": 0.027982680459759563 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.558641975308642, "acc_stderr": 0.027628737155668777, "acc_norm": 0.558641975308642, "acc_norm_stderr": 0.027628737155668777 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3829787234042553, "acc_stderr": 0.02899908090480618, "acc_norm": 0.3829787234042553, "acc_norm_stderr": 0.02899908090480618 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.3748370273794003, "acc_stderr": 0.012363652467551934, "acc_norm": 0.3748370273794003, "acc_norm_stderr": 0.012363652467551934 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4889705882352941, "acc_stderr": 0.030365446477275675, "acc_norm": 0.4889705882352941, "acc_norm_stderr": 0.030365446477275675 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4852941176470588, "acc_stderr": 0.020219083895133924, "acc_norm": 0.4852941176470588, "acc_norm_stderr": 0.020219083895133924 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.6081632653061224, "acc_stderr": 0.03125127591089165, "acc_norm": 0.6081632653061224, "acc_norm_stderr": 0.03125127591089165 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.5323383084577115, "acc_stderr": 0.03528131472933607, "acc_norm": 0.5323383084577115, "acc_norm_stderr": 0.03528131472933607 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 },
    "harness|hendrycksTest-virology|5": { "acc": 0.45180722891566266, "acc_stderr": 0.03874371556587953, "acc_norm": 0.45180722891566266, "acc_norm_stderr": 0.03874371556587953 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.6783625730994152, "acc_stderr": 0.03582529442573122, "acc_norm": 0.6783625730994152, "acc_norm_stderr": 0.03582529442573122 },
    "harness|truthfulqa:mc|0": { "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.5157743333677478, "mc2_stderr": 0.01586124547215222 }
}
```
### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xxyyy123/20k_v1_lora_qkvo_rank14_v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/20k_v1_lora_qkvo_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-03T13:20:05.284068(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9959f24019c42d63e36388b0bce4b7839e6823a1
Xuzhu (虚竹) data for ChatHaruhi2; it can be invoked as follows:

```python
from chatharuhi import ChatHaruhi

# Load the Xuzhu role from Hugging Face and back it with the OpenAI LLM.
chatbot = ChatHaruhi(
    role_from_hf='hhhwmws/xuzhu',
    llm='openai',
)

# Chat as the role "僧人" (monk), saying "你好!" ("Hello!").
response = chatbot.chat(role='僧人', text='你好!')
print(response)
```

Uploader: 米唯实

For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya)

You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting)

### Citation

Please cite the repo if you use the data or code in this repo.

```
@misc{li2023chatharuhi,
      title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
      author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
      year={2023},
      eprint={2308.09597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
hhhwmws/xuzhu
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:zh", "license:cc-by-4.0", "arxiv:2308.09597", "region:us" ]
2023-09-03T12:45:48+00:00
{"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]}
2023-09-03T12:48:11+00:00
[ "2308.09597" ]
[ "zh" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us
Xuzhu (虚竹) data for ChatHaruhi2; it can be invoked as shown in the snippet above. Uploader: 米唯实. For more details, see ChatHaruhi. You are welcome to join our crowdsourced character creation project. Citation: Please cite the repo if you use the data or code in this repo.
[]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
df56c98729d8775fa51c65cc757acc4ec948bb12
# Dataset Card for "infer_on_testds" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
linhtran92/infer_on_testds_v1
[ "region:us" ]
2023-09-03T12:47:25+00:00
{"dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "id", "dtype": "string"}, {"name": "w2v2_baseline_transcription", "dtype": "string"}, {"name": "w2v2_baseline_norm", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 174371437.027, "num_examples": 1299}], "download_size": 164200336, "dataset_size": 174371437.027}}
2023-09-03T12:47:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "infer_on_testds" More Information needed
[ "# Dataset Card for \"infer_on_testds\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"infer_on_testds\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"infer_on_testds\"\n\nMore Information needed" ]
67bdb4322003ecac211eac25ee986f6a8eba990a
Ding Chunqiu (丁春秋) data for ChatHaruhi2; it can be invoked as follows:

```python
from chatharuhi import ChatHaruhi

# Load the Ding Chunqiu role from Hugging Face and back it with the OpenAI LLM.
chatbot = ChatHaruhi(
    role_from_hf='hhhwmws/dingchunqiu',
    llm='openai',
)

# Chat as the role "萧峰" (Xiao Feng), calling out "丁春秋!" ("Ding Chunqiu!").
response = chatbot.chat(role='萧峰', text='丁春秋!')
print(response)
```

Uploader: 米唯实

For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya)

You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting)

### Citation

Please cite the repo if you use the data or code in this repo.

```
@misc{li2023chatharuhi,
      title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
      author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
      year={2023},
      eprint={2308.09597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
hhhwmws/dingchunqiu
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:zh", "license:cc-by-4.0", "arxiv:2308.09597", "region:us" ]
2023-09-03T12:49:33+00:00
{"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]}
2023-09-03T12:51:28+00:00
[ "2308.09597" ]
[ "zh" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us
Ding Chunqiu (丁春秋) data for ChatHaruhi2; it can be invoked as shown in the snippet above. Uploader: 米唯实. For more details, see ChatHaruhi. You are welcome to join our crowdsourced character creation project. Citation: Please cite the repo if you use the data or code in this repo.
[]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
832f303f1a30bccbb016607a37aaa718de2345c5
Xiao Feng (萧峰) data for ChatHaruhi2; it can be invoked as follows:

```python
from chatharuhi import ChatHaruhi

# Load the Xiao Feng role from Hugging Face and back it with the OpenAI LLM.
chatbot = ChatHaruhi(
    role_from_hf='hhhwmws/xiaofeng',
    llm='openai',
)

# Chat as the role "丁春秋" (Ding Chunqiu), saying "是我!" ("It's me!").
response = chatbot.chat(role='丁春秋', text='是我!')
print(response)
```

Uploader: 米唯实

For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya)

You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting)

### Citation

Please cite the repo if you use the data or code in this repo.

```
@misc{li2023chatharuhi,
      title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
      author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
      year={2023},
      eprint={2308.09597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
hhhwmws/xiaofeng
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:zh", "license:cc-by-4.0", "arxiv:2308.09597", "region:us" ]
2023-09-03T12:52:20+00:00
{"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]}
2023-09-03T12:53:38+00:00
[ "2308.09597" ]
[ "zh" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us
Xiao Feng (萧峰) data for ChatHaruhi2; it can be invoked as shown in the snippet above. Uploader: 米唯实. For more details, see ChatHaruhi. You are welcome to join our crowdsourced character creation project. Citation: Please cite the repo if you use the data or code in this repo.
[]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
6c2cbe52050214638b2578c80bededf43e9f7f68
Jiumozhi (鸠摩智) data for ChatHaruhi2; it can be invoked as follows:

```python
from chatharuhi import ChatHaruhi

# Load the Jiumozhi role from Hugging Face and back it with the OpenAI LLM.
chatbot = ChatHaruhi(
    role_from_hf='hhhwmws/jiumozhi',
    llm='openai',
)

# Chat as the role "萧峰" (Xiao Feng), saying "是我!" ("It's me!").
response = chatbot.chat(role='萧峰', text='是我!')
print(response)
```

Uploader: 米唯实

For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya)

You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting)

### Citation

Please cite the repo if you use the data or code in this repo.

```
@misc{li2023chatharuhi,
      title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
      author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
      year={2023},
      eprint={2308.09597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
hhhwmws/jiumozhi
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:zh", "license:cc-by-4.0", "arxiv:2308.09597", "region:us" ]
2023-09-03T12:54:59+00:00
{"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]}
2023-09-03T12:56:03+00:00
[ "2308.09597" ]
[ "zh" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us
Jiumozhi (鸠摩智) data for ChatHaruhi2; it can be invoked as shown in the snippet above. Uploader: 米唯实. For more details, see ChatHaruhi. You are welcome to join our crowdsourced character creation project. Citation: Please cite the repo if you use the data or code in this repo.
[]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
4af77e42bdf7fb70144b781d9fb70c3953ffe84b
# Dataset Card for Evaluation run of RWKV/rwkv-4-14b-pile

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-4-14b-pile
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [RWKV/rwkv-4-14b-pile](https://huggingface.co/RWKV/rwkv-4-14b-pile) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T21:19:09.628553](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile/blob/main/results_2023-10-15T21-19-09.628553.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0017827181208053692,
        "em_stderr": 0.0004320097346039099,
        "f1": 0.045845218120805505,
        "f1_stderr": 0.0012042308580297368,
        "acc": 0.32904612508698977,
        "acc_stderr": 0.007529648846042276
    },
    "harness|drop|3": {
        "em": 0.0017827181208053692,
        "em_stderr": 0.0004320097346039099,
        "f1": 0.045845218120805505,
        "f1_stderr": 0.0012042308580297368
    },
    "harness|gsm8k|5": {
        "acc": 0.0037907505686125853,
        "acc_stderr": 0.0016927007401501789
    },
    "harness|winogrande|5": {
        "acc": 0.654301499605367,
        "acc_stderr": 0.013366596951934373
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile
[ "region:us" ]
2023-09-03T12:55:51+00:00
{"pretty_name": "Evaluation run of RWKV/rwkv-4-14b-pile", "dataset_summary": "Dataset automatically created during the evaluation run of model [RWKV/rwkv-4-14b-pile](https://huggingface.co/RWKV/rwkv-4-14b-pile) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T21:19:09.628553](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile/blob/main/results_2023-10-15T21-19-09.628553.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346039099,\n \"f1\": 0.045845218120805505,\n \"f1_stderr\": 0.0012042308580297368,\n \"acc\": 0.32904612508698977,\n \"acc_stderr\": 0.007529648846042276\n },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346039099,\n \"f1\": 0.045845218120805505,\n \"f1_stderr\": 0.0012042308580297368\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401501789\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.013366596951934373\n }\n}\n```", "repo_url": "https://huggingface.co/RWKV/rwkv-4-14b-pile", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|arc:challenge|25_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T21_19_09.628553", "path": ["**/details_harness|drop|3_2023-10-15T21-19-09.628553.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T21-19-09.628553.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T21_19_09.628553", "path": ["**/details_harness|gsm8k|5_2023-10-15T21-19-09.628553.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T21-19-09.628553.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hellaswag|10_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:55:36.441206.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:55:36.441206.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T13:55:36.441206.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T13:55:36.441206.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T13:55:36.441206.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T21_19_09.628553", "path": ["**/details_harness|winogrande|5_2023-10-15T21-19-09.628553.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T21-19-09.628553.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T13_55_36.441206", "path": ["results_2023-09-03T13:55:36.441206.parquet"]}, {"split": "2023_10_15T21_19_09.628553", "path": ["results_2023-10-15T21-19-09.628553.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T21-19-09.628553.parquet"]}]}]}
2023-10-15T20:19:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of RWKV/rwkv-4-14b-pile

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model RWKV/rwkv-4-14b-pile on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch after this card):

## Latest results

These are the latest results from run 2023-10-15T21:19:09.628553 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
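The per-task details load exactly as in the full card above; the aggregated metrics live in the "results" configuration listed in this record's metadata. A minimal sketch:

```python
from datasets import load_dataset

# "results" aggregates both runs; "latest" resolves to the 2023-10-15 run,
# per the configs metadata above.
results = load_dataset(
    "open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile",
    "results",
    split="latest",
)
# Per-task details use a task config instead, e.g. "harness_winogrande_5".
```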
[ "# Dataset Card for Evaluation run of RWKV/rwkv-4-14b-pile", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model RWKV/rwkv-4-14b-pile on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T21:19:09.628553(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of RWKV/rwkv-4-14b-pile", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model RWKV/rwkv-4-14b-pile on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T21:19:09.628553(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of RWKV/rwkv-4-14b-pile## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model RWKV/rwkv-4-14b-pile on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T21:19:09.628553(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8ed67ab9461930b069a5a6cfff9aea2f25459609
# Dataset Card for "infer_on_testds_v2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
linhtran92/infer_on_testds_v2
[ "region:us" ]
2023-09-03T12:58:17+00:00
{"dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "id", "dtype": "string"}, {"name": "w2v2_baseline_transcription", "dtype": "string"}, {"name": "w2v2_baseline_norm", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 174371557.027, "num_examples": 1299}], "download_size": 164199656, "dataset_size": 174371557.027}}
2023-09-03T12:58:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "infer_on_testds_v2" More Information needed
[ "# Dataset Card for \"infer_on_testds_v2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"infer_on_testds_v2\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"infer_on_testds_v2\"\n\nMore Information needed" ]
015dcc76a7ddf565b22ffc1857843faeaa93db42
# Dataset Card for "Messidor2_except_0" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sngsfydy/Messidor2_except_0
[ "region:us" ]
2023-09-03T12:59:51+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "1", "1": "2", "2": "3", "3": "4"}}}}], "splits": [{"name": "train", "num_bytes": 1381059381.0, "num_examples": 727}], "download_size": 1375867454, "dataset_size": 1381059381.0}}
2023-09-03T13:04:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Messidor2_except_0" More Information needed
[ "# Dataset Card for \"Messidor2_except_0\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Messidor2_except_0\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"Messidor2_except_0\"\n\nMore Information needed" ]
2578717ed3e1a03e8d189e3ecff17ef5201df175
# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Undi95/ReMM-L2-13B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Undi95/ReMM-L2-13B](https://huggingface.co/Undi95/ReMM-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-L2-13B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-16T18:10:03.763192](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B/blob/main/results_2023-10-16T18-10-03.763192.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.3603187919463087,
        "em_stderr": 0.004916600307723715,
        "f1": 0.4369505033557066,
        "f1_stderr": 0.004727018193601331,
        "acc": 0.3873174710218511,
        "acc_stderr": 0.008457350051798611
    },
    "harness|drop|3": {
        "em": 0.3603187919463087,
        "em_stderr": 0.004916600307723715,
        "f1": 0.4369505033557066,
        "f1_stderr": 0.004727018193601331
    },
    "harness|gsm8k|5": {
        "acc": 0.029567854435178165,
        "acc_stderr": 0.004665893134220799
    },
    "harness|winogrande|5": {
        "acc": 0.745067087608524,
        "acc_stderr": 0.012248806969376422
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
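Since the card above describes 64 configurations, one per evaluated task, the available config names can be listed before loading anything. A small sketch using the `datasets` helper `get_dataset_config_names` (config names such as `harness_arc_challenge_25` and `harness_drop_3` appear in this repo's configs):

```python
from datasets import get_dataset_config_names

# Enumerate the per-task configurations of this details repo.
configs = get_dataset_config_names("open-llm-leaderboard/details_Undi95__ReMM-L2-13B")
print(len(configs))         # one config per evaluated task, plus "results"
print(sorted(configs)[:5])  # e.g. harness_arc_challenge_25, harness_drop_3, ...
```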
open-llm-leaderboard/details_Undi95__ReMM-L2-13B
[ "region:us" ]
2023-09-03T13:15:52+00:00
{"pretty_name": "Evaluation run of Undi95/ReMM-L2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/ReMM-L2-13B](https://huggingface.co/Undi95/ReMM-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-L2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T18:10:03.763192](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B/blob/main/results_2023-10-16T18-10-03.763192.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3603187919463087,\n \"em_stderr\": 0.004916600307723715,\n \"f1\": 0.4369505033557066,\n \"f1_stderr\": 0.004727018193601331,\n \"acc\": 0.3873174710218511,\n \"acc_stderr\": 0.008457350051798611\n },\n \"harness|drop|3\": {\n \"em\": 0.3603187919463087,\n \"em_stderr\": 0.004916600307723715,\n \"f1\": 0.4369505033557066,\n \"f1_stderr\": 0.004727018193601331\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.029567854435178165,\n \"acc_stderr\": 0.004665893134220799\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n }\n}\n```", "repo_url": "https://huggingface.co/Undi95/ReMM-L2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|arc:challenge|25_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T18_10_03.763192", "path": ["**/details_harness|drop|3_2023-10-16T18-10-03.763192.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T18-10-03.763192.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T18_10_03.763192", "path": ["**/details_harness|gsm8k|5_2023-10-16T18-10-03.763192.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T18-10-03.763192.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hellaswag|10_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T14:15:27.893202.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T14:15:27.893202.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T18_10_03.763192", "path": ["**/details_harness|winogrande|5_2023-10-16T18-10-03.763192.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T18-10-03.763192.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T14_15_27.893202", "path": ["results_2023-09-03T14:15:27.893202.parquet"]}, {"split": "2023_10_16T18_10_03.763192", "path": ["results_2023-10-16T18-10-03.763192.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T18-10-03.763192.parquet"]}]}]}
2023-10-16T17:10:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Undi95/ReMM-L2-13B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T18:10:03.763192 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
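In this processed copy of the card, the snippet that followed "do the following:" was stripped; for reference, the loading example from the original card text above is:

```python
from datasets import load_dataset

# Load the winogrande details split for this model's evaluation run.
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-L2-13B",
	"harness_winogrande_5",
	split="train")
```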
[ "# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/ReMM-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T18:10:03.763192(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/ReMM-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T18:10:03.763192(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/ReMM-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T18:10:03.763192(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f053329f5376665234e7de845b253f20ca89275f
# Dataset Card for "FronxOwnerManual" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jmgb0127/FronxOwnerManual
[ "region:us" ]
2023-09-03T13:16:37+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1330685, "num_examples": 1177}, {"name": "test", "num_bytes": 332811, "num_examples": 294}], "download_size": 990561, "dataset_size": 1663496}}
2023-09-03T13:24:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "FronxOwnerManual" More Information needed
[ "# Dataset Card for \"FronxOwnerManual\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"FronxOwnerManual\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"FronxOwnerManual\"\n\nMore Information needed" ]
c0a536656b7c90ec68479e6066943dcbf66af75f
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank28_v2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank28_v2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qkvo_rank28_v2](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank28_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T22:28:45.139807](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2/blob/main/results_2023-09-17T22-28-45.139807.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.30253775167785235,
        "em_stderr": 0.004704243479116463,
        "f1": 0.3736682046979874,
        "f1_stderr": 0.004609071808093349,
        "acc": 0.36925201639806293,
        "acc_stderr": 0.007290194379176739
    },
    "harness|drop|3": {
        "em": 0.30253775167785235,
        "em_stderr": 0.004704243479116463,
        "f1": 0.3736682046979874,
        "f1_stderr": 0.004609071808093349
    },
    "harness|gsm8k|5": {
        "acc": 0.006065200909780136,
        "acc_stderr": 0.0021386703014604704
    },
    "harness|winogrande|5": {
        "acc": 0.7324388318863457,
        "acc_stderr": 0.012441718456893009
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
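As the card above notes, each run is stored as a timestamped split, with "latest" pointing at the most recent one. A sketch of loading both, using split names as they appear in this repo's configs (e.g. for harness_drop_3):

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2"

# "latest" always mirrors the most recent run of this task.
latest = load_dataset(repo, "harness_drop_3", split="latest")

# A specific run can be pinned by its timestamped split name (from the configs).
pinned = load_dataset(repo, "harness_drop_3", split="2023_09_17T22_28_45.139807")

print(latest.num_rows, pinned.num_rows)  # identical here: "latest" currently points at this run
```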
open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2
[ "region:us" ]
2023-09-03T13:47:23+00:00
{"pretty_name": "Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank28_v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qkvo_rank28_v2](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank28_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T22:28:45.139807](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2/blob/main/results_2023-09-17T22-28-45.139807.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.30253775167785235,\n \"em_stderr\": 0.004704243479116463,\n \"f1\": 0.3736682046979874,\n \"f1_stderr\": 0.004609071808093349,\n \"acc\": 0.36925201639806293,\n \"acc_stderr\": 0.007290194379176739\n },\n \"harness|drop|3\": {\n \"em\": 0.30253775167785235,\n \"em_stderr\": 0.004704243479116463,\n \"f1\": 0.3736682046979874,\n \"f1_stderr\": 0.004609071808093349\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \"acc_stderr\": 0.0021386703014604704\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.012441718456893009\n }\n}\n```", "repo_url": "https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank28_v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|arc:challenge|25_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T22_28_45.139807", "path": ["**/details_harness|drop|3_2023-09-17T22-28-45.139807.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T22-28-45.139807.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T22_28_45.139807", "path": ["**/details_harness|gsm8k|5_2023-09-17T22-28-45.139807.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T22-28-45.139807.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hellaswag|10_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T14:46:59.619219.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:46:59.619219.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:46:59.619219.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T14:46:59.619219.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T14:46:59.619219.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T14:46:59.619219.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T22_28_45.139807", "path": ["**/details_harness|winogrande|5_2023-09-17T22-28-45.139807.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T22-28-45.139807.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T14_46_59.619219", "path": ["results_2023-09-03T14:46:59.619219.parquet"]}, {"split": "2023_09_17T22_28_45.139807", "path": ["results_2023-09-17T22-28-45.139807.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T22-28-45.139807.parquet"]}]}]}
2023-09-17T21:28:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank28_v2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qkvo_rank28_v2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T22:28:45.139807 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
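The summary's "do the following" refers to a `load_dataset` call; a minimal sketch for this record (repository id, config name, and split names as listed in the metadata above):

```python
from datasets import load_dataset

# Per-sample details of the 5-shot WinoGrande eval for this run;
# the "latest" split always mirrors the most recent results.
data = load_dataset(
    "open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2",
    "harness_winogrande_5",
    split="latest",
)
```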
[ "# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank28_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qkvo_rank28_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T22:28:45.139807(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank28_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qkvo_rank28_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T22:28:45.139807(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank28_v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qkvo_rank28_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T22:28:45.139807(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f6c011be1bb3d7f1dee2c1f58c6c4d03815c16c7
# Dolly-15k-German

This is a German version of the Dolly-15k dataset from databricks (https://huggingface.co/datasets/databricks/databricks-dolly-15k).

The translation was done using the Translator API from Azure.

Disclaimer: The quality of the translation has not been reviewed and there was no post-processing.

So everything said in the original dataset card should also apply here.

Therefore the license is the same as that of the original dataset.

Have Fun :)
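For example, the records can be loaded with the `datasets` library; a minimal sketch (split and column names as listed in this repository's metadata):

```python
from datasets import load_dataset

# Load the German translation; the only split is "train" (15,011 examples).
ds = load_dataset("DRXD1000/Dolly-15k-German", split="train")

# Each row carries the translated instruction, context, and response,
# plus the category inherited from the original Dolly-15k.
row = ds[0]
print(row["instruction_de"], "->", row["category"])
```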
DRXD1000/Dolly-15k-German
[ "task_categories:question-answering", "task_categories:summarization", "task_categories:text-generation", "size_categories:10K<n<100K", "language:de", "license:cc-by-3.0", "region:us" ]
2023-09-03T13:54:18+00:00
{"language": ["de"], "license": "cc-by-3.0", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "summarization", "text-generation"], "dataset_info": {"features": [{"name": "instruction_de", "dtype": "string"}, {"name": "context_de", "dtype": "string"}, {"name": "response_de", "dtype": "string"}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13900072, "num_examples": 15011}], "download_size": 8816923, "dataset_size": 13900072}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-10-31T07:06:14+00:00
[]
[ "de" ]
TAGS #task_categories-question-answering #task_categories-summarization #task_categories-text-generation #size_categories-10K<n<100K #language-German #license-cc-by-3.0 #region-us
# Dolly-15k-German This is a German version of the Dolly-15k dataset from databricks (URL The translation was done using the Translator API from Azure. Disclaimer: The quality of the translation has not been reviewed and there was no post-processing. So everything said in the original dataset card should also apply here. Therefore the license is the same as that of the original dataset. Have Fun :)
[ "# Dolly-15k-German\n\nThis is a German version of the Dolly-15k Dataset from databricks (URL\n\nThe Translation was done using the Translator API from Azure.\n\nDisclaimer: The quality of the translation has not been reviewed and there was no post-processing. \n\nSo everything said in the original model card should also apply here. \n\nTherefore the Licence is the same as in the original model.\n\nHave Fun :)" ]
[ "TAGS\n#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #size_categories-10K<n<100K #language-German #license-cc-by-3.0 #region-us \n", "# Dolly-15k-German\n\nThis is a German version of the Dolly-15k Dataset from databricks (URL\n\nThe Translation was done using the Translator API from Azure.\n\nDisclaimer: The quality of the translation has not been reviewed and there was no post-processing. \n\nSo everything said in the original model card should also apply here. \n\nTherefore the Licence is the same as in the original model.\n\nHave Fun :)" ]
[ 64, 89 ]
[ "passage: TAGS\n#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #size_categories-10K<n<100K #language-German #license-cc-by-3.0 #region-us \n# Dolly-15k-German\n\nThis is a German version of the Dolly-15k Dataset from databricks (URL\n\nThe Translation was done using the Translator API from Azure.\n\nDisclaimer: The quality of the translation has not been reviewed and there was no post-processing. \n\nSo everything said in the original model card should also apply here. \n\nTherefore the Licence is the same as in the original model.\n\nHave Fun :)" ]
13d015eec0f44b9c1f0a20ccd49f657fd3cf097c
# Dataset of Vipère

This is the dataset of Vipère, containing 167 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 167 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 386 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 167 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 167 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 167 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 167 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 167 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 386 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 386 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 386 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
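The archives in the table can be fetched directly from the Hub; a minimal sketch using `huggingface_hub` (archive name taken from the table above, assumed to sit at the repository root):

```python
from huggingface_hub import hf_hub_download

# Download one of the aligned archives from this dataset repository.
path = hf_hub_download(
    repo_id="CyberHarem/vipere_birdiewinggolfgirlsstory",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```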
CyberHarem/vipere_birdiewinggolfgirlsstory
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-03T14:39:19+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:32+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Vipère ================= This is the dataset of Vipère, containing 167 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
0b0af9333fb60898ee6edf70b87119fc7ee48fda
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qk_rank14_v2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/xxyyy123/10k_v1_lora_qk_rank14_v2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qk_rank14_v2](https://huggingface.co/xxyyy123/10k_v1_lora_qk_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-03T15:46:18.274387](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2/blob/main/results_2023-09-03T15%3A46%3A18.274387.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5170296348361414, "acc_stderr": 0.03493290232216538, "acc_norm": 0.5207737377982975, "acc_norm_stderr": 0.034916919339016556, "mc1": 0.3574051407588739, "mc1_stderr": 0.016776599676729398, "mc2": 0.5241397415740128, "mc2_stderr": 0.0157002252598079 }, "harness|arc:challenge|25": { "acc": 0.5298634812286689, "acc_stderr": 0.014585305840007105, "acc_norm": 0.5648464163822525, "acc_norm_stderr": 0.014487986197186043 }, "harness|hellaswag|10": { "acc": 0.6100378410675165, "acc_stderr": 0.004867445945277159, "acc_norm": 0.7959569806811392, "acc_norm_stderr": 0.004021769582317863 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4962962962962963, "acc_stderr": 0.04319223625811331, "acc_norm": 0.4962962962962963, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.48026315789473684, "acc_stderr": 0.040657710025626036, "acc_norm": 0.48026315789473684, "acc_norm_stderr": 0.040657710025626036 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6037735849056604, "acc_stderr": 0.030102793781791197, "acc_norm": 0.6037735849056604, "acc_norm_stderr": 0.030102793781791197 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5486111111111112, "acc_stderr": 0.041614023984032786, "acc_norm": 0.5486111111111112, "acc_norm_stderr": 0.041614023984032786 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 
0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4797687861271676, "acc_stderr": 0.03809342081273957, "acc_norm": 0.4797687861271676, "acc_norm_stderr": 0.03809342081273957 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.04533838195929775, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.04533838195929775 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4595744680851064, "acc_stderr": 0.032579014820998356, "acc_norm": 0.4595744680851064, "acc_norm_stderr": 0.032579014820998356 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.34210526315789475, "acc_stderr": 0.04462917535336936, "acc_norm": 0.34210526315789475, "acc_norm_stderr": 0.04462917535336936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4413793103448276, "acc_stderr": 0.04137931034482758, "acc_norm": 0.4413793103448276, "acc_norm_stderr": 0.04137931034482758 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.023517294335963286, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.023517294335963286 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.0404061017820884, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.0404061017820884 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5709677419354838, "acc_stderr": 0.028156036538233193, "acc_norm": 0.5709677419354838, "acc_norm_stderr": 0.028156036538233193 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3694581280788177, "acc_stderr": 0.03395970381998573, "acc_norm": 0.3694581280788177, "acc_norm_stderr": 0.03395970381998573 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7090909090909091, "acc_stderr": 0.03546563019624336, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.03546563019624336 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6818181818181818, "acc_stderr": 0.033184773338453294, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.033184773338453294 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7409326424870466, "acc_stderr": 0.03161877917935413, "acc_norm": 0.7409326424870466, "acc_norm_stderr": 0.03161877917935413 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4897435897435897, "acc_stderr": 0.025345672221942374, "acc_norm": 0.4897435897435897, "acc_norm_stderr": 0.025345672221942374 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.026593939101844075, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.026593939101844075 }, 
"harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5084033613445378, "acc_stderr": 0.0324739027656967, "acc_norm": 0.5084033613445378, "acc_norm_stderr": 0.0324739027656967 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.728440366972477, "acc_stderr": 0.01906909836319144, "acc_norm": 0.728440366972477, "acc_norm_stderr": 0.01906909836319144 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.375, "acc_stderr": 0.033016908987210894, "acc_norm": 0.375, "acc_norm_stderr": 0.033016908987210894 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6862745098039216, "acc_stderr": 0.032566854844603886, "acc_norm": 0.6862745098039216, "acc_norm_stderr": 0.032566854844603886 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7257383966244726, "acc_stderr": 0.029041333510598035, "acc_norm": 0.7257383966244726, "acc_norm_stderr": 0.029041333510598035 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5964125560538116, "acc_stderr": 0.03292802819330314, "acc_norm": 0.5964125560538116, "acc_norm_stderr": 0.03292802819330314 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5954198473282443, "acc_stderr": 0.043046937953806645, "acc_norm": 0.5954198473282443, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6528925619834711, "acc_stderr": 0.04345724570292534, "acc_norm": 0.6528925619834711, "acc_norm_stderr": 0.04345724570292534 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6851851851851852, "acc_stderr": 0.04489931073591312, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 0.04489931073591312 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5950920245398773, "acc_stderr": 0.03856672163548914, "acc_norm": 0.5950920245398773, "acc_norm_stderr": 0.03856672163548914 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.04498676320572924, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.04498676320572924 }, "harness|hendrycksTest-marketing|5": { "acc": 0.782051282051282, "acc_stderr": 0.02704685763071669, "acc_norm": 0.782051282051282, "acc_norm_stderr": 0.02704685763071669 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7088122605363985, "acc_stderr": 0.016246087069701407, "acc_norm": 0.7088122605363985, "acc_norm_stderr": 0.016246087069701407 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5664739884393064, "acc_stderr": 0.026680134761679214, "acc_norm": 0.5664739884393064, "acc_norm_stderr": 0.026680134761679214 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2681564245810056, "acc_stderr": 0.014816119635317003, "acc_norm": 0.2681564245810056, "acc_norm_stderr": 0.014816119635317003 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5392156862745098, "acc_stderr": 0.028541722692618874, "acc_norm": 0.5392156862745098, "acc_norm_stderr": 0.028541722692618874 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5819935691318328, "acc_stderr": 0.028013651891995072, "acc_norm": 
0.5819935691318328, "acc_norm_stderr": 0.028013651891995072 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5555555555555556, "acc_stderr": 0.027648477877413324, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.027648477877413324 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3723404255319149, "acc_stderr": 0.028838921471251458, "acc_norm": 0.3723404255319149, "acc_norm_stderr": 0.028838921471251458 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.38396349413298564, "acc_stderr": 0.01242158783313423, "acc_norm": 0.38396349413298564, "acc_norm_stderr": 0.01242158783313423 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4852941176470588, "acc_stderr": 0.03035969707904611, "acc_norm": 0.4852941176470588, "acc_norm_stderr": 0.03035969707904611 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4869281045751634, "acc_stderr": 0.020220920829626912, "acc_norm": 0.4869281045751634, "acc_norm_stderr": 0.020220920829626912 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6081632653061224, "acc_stderr": 0.03125127591089165, "acc_norm": 0.6081632653061224, "acc_norm_stderr": 0.03125127591089165 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6019900497512438, "acc_stderr": 0.03461199429040013, "acc_norm": 0.6019900497512438, "acc_norm_stderr": 0.03461199429040013 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.03851597683718534, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.03851597683718534 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.695906432748538, "acc_stderr": 0.03528211258245229, "acc_norm": 0.695906432748538, "acc_norm_stderr": 0.03528211258245229 }, "harness|truthfulqa:mc|0": { "mc1": 0.3574051407588739, "mc1_stderr": 0.016776599676729398, "mc2": 0.5241397415740128, "mc2_stderr": 0.0157002252598079 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
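The same pattern works for the aggregated numbers quoted above; a minimal sketch loading the "results" configuration (assuming the standard config/split layout of these leaderboard detail repositories):

```python
from datasets import load_dataset

# Aggregated scores for this run; "latest" points at the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the run
```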
open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2
[ "region:us" ]
2023-09-03T14:46:42+00:00
{"pretty_name": "Evaluation run of xxyyy123/10k_v1_lora_qk_rank14_v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qk_rank14_v2](https://huggingface.co/xxyyy123/10k_v1_lora_qk_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-03T15:46:18.274387](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2/blob/main/results_2023-09-03T15%3A46%3A18.274387.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5170296348361414,\n \"acc_stderr\": 0.03493290232216538,\n \"acc_norm\": 0.5207737377982975,\n \"acc_norm_stderr\": 0.034916919339016556,\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.016776599676729398,\n \"mc2\": 0.5241397415740128,\n \"mc2_stderr\": 0.0157002252598079\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n \"acc_norm\": 0.5648464163822525,\n \"acc_norm_stderr\": 0.014487986197186043\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6100378410675165,\n \"acc_stderr\": 0.004867445945277159,\n \"acc_norm\": 0.7959569806811392,\n \"acc_norm_stderr\": 0.004021769582317863\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 
0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5709677419354838,\n \"acc_stderr\": 0.028156036538233193,\n \"acc_norm\": 0.5709677419354838,\n \"acc_norm_stderr\": 0.028156036538233193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998573,\n \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998573\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.033184773338453294,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.033184773338453294\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.03161877917935413,\n \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.03161877917935413\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844075,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5084033613445378,\n \"acc_stderr\": 0.0324739027656967,\n \"acc_norm\": 0.5084033613445378,\n \"acc_norm_stderr\": 0.0324739027656967\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.728440366972477,\n \"acc_stderr\": 0.01906909836319144,\n \"acc_norm\": 0.728440366972477,\n \"acc_norm_stderr\": 0.01906909836319144\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548914,\n \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548914\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n \"acc_stderr\": 0.02704685763071669,\n \"acc_norm\": 0.782051282051282,\n \"acc_norm_stderr\": 0.02704685763071669\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n 
\"acc_stderr\": 0.016246087069701407,\n \"acc_norm\": 0.7088122605363985,\n \"acc_norm_stderr\": 0.016246087069701407\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679214,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679214\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n \"acc_stderr\": 0.014816119635317003,\n \"acc_norm\": 0.2681564245810056,\n \"acc_norm_stderr\": 0.014816119635317003\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413324,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413324\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.38396349413298564,\n \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626912,\n \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626912\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n \"acc_stderr\": 0.03461199429040013,\n \"acc_norm\": 0.6019900497512438,\n \"acc_norm_stderr\": 0.03461199429040013\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.03851597683718534,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.03851597683718534\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245229,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245229\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.016776599676729398,\n \"mc2\": 0.5241397415740128,\n \"mc2_stderr\": 0.0157002252598079\n }\n}\n```", "repo_url": "https://huggingface.co/xxyyy123/10k_v1_lora_qk_rank14_v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": 
"harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|arc:challenge|25_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hellaswag|10_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T15:46:18.274387.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T15:46:18.274387.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T15:46:18.274387.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T15:46:18.274387.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T15:46:18.274387.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T15_46_18.274387", "path": ["results_2023-09-03T15:46:18.274387.parquet"]}, {"split": "latest", "path": ["results_2023-09-03T15:46:18.274387.parquet"]}]}]}
2023-09-03T14:47:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qk_rank14_v2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qk_rank14_v2 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-03T15:46:18.274387 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
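The load example referenced by "do the following:" above was stripped along with the URLs; it survives verbatim in this record's metadata field and is repeated here for readability:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2",
	"harness_truthfulqa_mc_0",
	split="train")
```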
[ "# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qk_rank14_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qk_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-03T15:46:18.274387(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qk_rank14_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qk_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-03T15:46:18.274387(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qk_rank14_v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/10k_v1_lora_qk_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-03T15:46:18.274387(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b148e728582e2dacfbc813955bd0ead6b4c4e90e
# Bangumi Image Base of Bento

This is the image base of bangumi Bento. We detected 17 characters and 1566 images in total. The full dataset is [here](all.zip).

**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability). A minimal sketch of that step is shown after the table.

Here is a preview of the characters:

| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 208 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) |
| 1 | 125 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) |
| 2 | 72 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) |
| 3 | 411 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) |
| 4 | 15 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) |
| 5 | 18 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) |
| 6 | 42 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) |
| 7 | 40 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) |
| 8 | 18 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) |
| 9 | 139 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) |
| 10 | 29 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) |
| 11 | 26 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) |
| 12 | 18 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) |
| 13 | 46 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) |
| 14 | 18 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) |
| 15 | 183 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) |
| noise | 158 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
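The preprocessing recommendation above is left abstract; the sketch below shows one minimal way to act on it, assuming the per-character `dataset.zip` archives referenced in the Download column have been fetched locally (the archive layout and PNG file format are assumptions, not guaranteed by the card):

```python
import zipfile
from pathlib import Path

def extract_character_pack(char_id: str, dest_root: str = "bento_chars") -> list[Path]:
    """Extract one character's image pack so the ~1% of noisy samples
    can be reviewed by hand and removed before training."""
    archive = Path(char_id) / "dataset.zip"   # assumed path, mirroring the table's links
    out_dir = Path(dest_root) / char_id
    out_dir.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(out_dir)
    # The previews are PNGs; the actual archive contents may differ.
    return sorted(out_dir.rglob("*.png"))

images = extract_character_pack("0")          # character 0 has 208 images per the table
print(f"Extracted {len(images)} images for manual review.")
```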
BangumiBase/xxx
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
2023-09-03T14:48:27+00:00
{"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]}
2023-09-29T04:33:27+00:00
[]
[]
TAGS #size_categories-1K<n<10K #license-mit #art #region-us
Bangumi Image Base of Bento =========================== This is the image base of bangumi bento; we detected 17 characters and 1566 images in total. The full dataset is here. Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview:
[]
[ "TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
[ 25 ]
[ "passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
2e633d270c3367632ef9471790b621f6319cf151
# Dataset Card

## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
  - [Languages](#languages)
- [Dataset Structure](#dataset-structure)
  - [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Additional Information](#additional-information)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)

## Dataset Description

### Dataset Summary

Dataset for automatic text segmentation of Russian news. The news corpus is based on Yuri Baburov's news dataset https://github.com/buriy/russian-nlp-datasets/releases/ covering the years 2014 and 2015. Markup was generated automatically using two methods: taking texts with an existing division into paragraphs, and randomly joining parts of different texts.

### Supported Tasks and Leaderboards

The dataset is designed for the text segmentation task.

### Languages

The dataset is in Russian.

### Usage

```python
from datasets import load_dataset

dataset = load_dataset('mlenjoyneer/RuTextSegNews')
```

### Other datasets

mlenjoyneer/RuTextSegWiki

## Dataset Structure

### Data Instances

For each instance, there is a list of strings with the text sentences, a list of ints with the labels (1 marks the start of a new topic, 0 the continuation of the previous topic), and a string with the sample generation method (base or random_joining).

```
{
  "sentences": [
    "В среду в столице Катара Дохе начался чемпионат мира по плаванию на короткой воде.",
    "За пять соревновательных дней в бассейне Hamad Aquatic Complex будет разыграно 46 комплектов наград.",
    "Соревнования в Дохе станут последней серьезной проверкой спортсменов перед чемпионатом мира по водным видам спорта, который пройдет в Казани в следующем году.",
    "В первый день соревнований серебро на дистанции 200 метров вольным стилем выиграли россиянин Данила Изотов и мужская сборная России в эстафете 4х100 метров кролем.",
    "Для российских спортсменов главным отборочным турниром к ЧМ стал чемпионат России, который состоялся месяц назад в Казани.",
    "Тренерский штаб сборной освободил от квалификации только одного спортсмена – Владимира Морозова, который на прошлом ЧМ в Стамбуле выиграл на дистанциях 50 и 100 м вольным стилем.",
    "Он стал единственным пловцом в составе россиян, кто выиграл на том чемпионате золотые медали.",
    "Изначально планировалось, что Морозов в Дохе стартует сразу на четырех дистанциях, но в итоге его программу сократили вдвое – до 100 м в комплексном плавании и 50 м кролем.",
    "Остальным спортсменам, в том числе и двукратному серебряному медалисту Олимпиады-2012 в Лондоне Евгению Коротышкину, пришлось проходить отбор.",
    "По окончании чемпионата России Коротышкин в интервью РБК отмечал, что его победного результата на чемпионате России на дистанции 100 м баттерфляем (50,1 с) может не хватить для попадания на пьедестал.",
    "«Чтобы бороться за медали в Дохе, необходимо улучшать это время на полсекунды, а может, и больше, – отметил Коротышкин в разговоре с автором этих строк.",
    "– Думаю, для бронзовой награды хватит 49,3 с, а победитель вполне может пробить отметку в 49 с».",
    "Российский пловец в начале ноября отмечал, что за месяц до старта чемпионата мира в Дохе у него есть резервы в улучшении результата.",
    "«В первую очередь надо работать над стартом и переворотами», – рассказал Коротышкин РБК.",
    "Главным соперником Евгения на стометровке баттерфляем станет южноафриканец Чад де Кло, накануне признанный FINA пловцом года.",
    "По словам Коротышкина, де Кло находится нынче в великолепной форме и постарается побить мировой рекорд россиянина на этой дистанции, который составляет 48,48 с. Еще до старта чемпионата одним из претендентов на медали помимо Морозова и Коротышкина в составе россиян являлся Данила Изотов.",
    "Вчера он сначала пробился в финал на дистанции 200 м вольным стилем с третьим результатом (1:42,45), а в решающем заплыве уступил лишь 22 сотые секунды Чаду де Кло и принес России первую серебряную медаль чемпионата мира в Дохе.",
    "Вместе со сборной России Изотов отобрался с пятым временем (3:08,15) в финал эстафеты 4 х 200 кролем.",
    "Женская команда России также выступит в решающем заплыве в этом виде программы.",
    "В эстафете 4 х 100 метров кролем мужская сборная России завоевала серебряную медаль, уступив только французскому квартету."
  ],
  "labels": [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
  "method": "base"
}
```

### Data Splits

| Dataset Split | Number of Instances in Split |
|:---------|:---------|
| Train | 20000 |
| Test | 4000 |

## Additional Information

### Licensing Information

In progress

### Citation Information

```bibtex
In progress
```
mlenjoyneer/RuTextSegNews
[ "annotations_creators:machine-generated", "language_creators:found", "multilinguality:monolingual", "size_categories:10K<n<100K", "source_datasets:original", "language:ru", "license:unknown", "region:us" ]
2023-09-03T14:53:42+00:00
{"annotations_creators": ["machine-generated"], "language_creators": ["found"], "language": ["ru"], "license": ["unknown"], "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "source_datasets": ["original"]}
2023-09-03T15:14:33+00:00
[]
[ "ru" ]
TAGS #annotations_creators-machine-generated #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Russian #license-unknown #region-us
Dataset Card ============ Table of Contents ----------------- * Table of Contents * Dataset Description + Dataset Summary + Supported Tasks and Leaderboards + Languages * Dataset Structure + Data Instances + Data Fields + Data Splits * Additional Information + Licensing Information + Citation Information Dataset Description ------------------- ### Dataset Summary Dataset for automatic text segmentation of Russian news. The news corpus is based on Yuri Baburov's news dataset URL, covering the years 2014 and 2015. Markup was generated automatically using two methods: taking texts with an existing division into paragraphs, and randomly joining parts of different texts. ### Supported Tasks and Leaderboards The dataset is designed for the text segmentation task. ### Languages The dataset is in Russian. ### Usage ### Other datasets mlenjoyneer/RuTextSegWiki Dataset Structure ----------------- ### Data Instances For each instance, there is a list of strings with the text sentences, a list of ints with the labels (1 marks the start of a new topic, 0 the continuation of the previous topic), and a string with the sample generation method (base or random\_joining). ### Data Splits Additional Information ---------------------- ### Licensing Information In progress
[ "### Dataset Summary\n\n\nDataset for automatic text segmentation of Russian news. News corpora based on Yuri Baburov's news dataset URL for 2014 and 2015 years. Markup was generated automatically based on 2 methods: taking texts with ready division into paragraphs and random joining parts of different texts.", "### Supported Tasks and Leaderboards\n\n\nDataset designed for text segmentation task.", "### Languages\n\n\nThe dataset is in Russian.", "### Usage", "### Other datasets\n\n\nmlenjoyneer/RuTextSegWiki\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nFor each instance, there is a list of strings for text sentences, a list of ints for labels (1 is new topic starting and 0 is previous topic continuation) and a string for sample generation method (base or random\\_joining).", "### Data Splits\n\n\n\nAdditional Information\n----------------------", "### Licensing Information\n\n\nIn progress" ]
[ "TAGS\n#annotations_creators-machine-generated #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Russian #license-unknown #region-us \n", "### Dataset Summary\n\n\nDataset for automatic text segmentation of Russian news. News corpora based on Yuri Baburov's news dataset URL for 2014 and 2015 years. Markup was generated automatically based on 2 methods: taking texts with ready division into paragraphs and random joining parts of different texts.", "### Supported Tasks and Leaderboards\n\n\nDataset designed for text segmentation task.", "### Languages\n\n\nThe dataset is in Russian.", "### Usage", "### Other datasets\n\n\nmlenjoyneer/RuTextSegWiki\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nFor each instance, there is a list of strings for text sentences, a list of ints for labels (1 is new topic starting and 0 is previous topic continuation) and a string for sample generation method (base or random\\_joining).", "### Data Splits\n\n\n\nAdditional Information\n----------------------", "### Licensing Information\n\n\nIn progress" ]
[ 67, 66, 19, 11, 4, 24, 59, 12, 8 ]
[ "passage: TAGS\n#annotations_creators-machine-generated #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Russian #license-unknown #region-us \n### Dataset Summary\n\n\nDataset for automatic text segmentation of Russian news. News corpora based on Yuri Baburov's news dataset URL for 2014 and 2015 years. Markup was generated automatically based on 2 methods: taking texts with ready division into paragraphs and random joining parts of different texts.### Supported Tasks and Leaderboards\n\n\nDataset designed for text segmentation task.### Languages\n\n\nThe dataset is in Russian.### Usage### Other datasets\n\n\nmlenjoyneer/RuTextSegWiki\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nFor each instance, there is a list of strings for text sentences, a list of ints for labels (1 is new topic starting and 0 is previous topic continuation) and a string for sample generation method (base or random\\_joining).### Data Splits\n\n\n\nAdditional Information\n----------------------### Licensing Information\n\n\nIn progress" ]
105dcd9651cb00f2197871523ac30bad2bf46450
# Dataset of Evangeline Burton

This is the dataset of Evangeline Burton, containing 200 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 473 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 473 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 473 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 473 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/evangeline_burton_birdiewinggolfgirlsstory
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-03T15:03:29+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:34+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Evangeline Burton ============================ This is the dataset of Evangeline Burton, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
f7d2c6175b781abc39e404e108901f6cf3933644
# Dataset Card

## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
  - [Languages](#languages)
- [Dataset Structure](#dataset-structure)
  - [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Additional Information](#additional-information)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)

## Dataset Description

### Dataset Summary

Dataset for automatic text segmentation of Russian Wikipedia. The text corpus is based on the May 2023 Wikipedia dump. Markup was generated automatically using two methods: taking texts with an existing division into paragraphs, and randomly joining parts of different texts.

### Supported Tasks and Leaderboards

The dataset is designed for the text segmentation task.

### Languages

The dataset is in Russian.

### Usage

```python
from datasets import load_dataset

dataset = load_dataset('mlenjoyneer/RuTextSegWiki')
```

### Other datasets

mlenjoyneer/RuTextSegNews - a similar dataset based on a news corpus

## Dataset Structure

### Data Instances

For each instance, there is a list of strings with the text sentences, a list of ints with the labels (1 marks the start of a new topic, 0 the continuation of the previous topic), and a string with the sample generation method (base or random_joining).

### Data Splits

| Dataset Split | Number of Instances in Split |
|:---------|:---------|
| Train | 20000 |
| Test | 4000 |

## Additional Information

### Licensing Information

In progress

### Citation Information

```bibtex
In progress
```
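Because each sample records its generation method, it is easy to keep only the samples built from real paragraph boundaries. A minimal sketch, assuming the fields described above and a lowercase `train` split name:

```python
# A minimal sketch: drop the synthetically joined samples and keep only
# markup derived from real paragraph boundaries ("base").
from datasets import load_dataset

dataset = load_dataset("mlenjoyneer/RuTextSegWiki", split="train")  # split name assumed
base_only = dataset.filter(lambda ex: ex["method"] == "base")
print(len(dataset), "->", len(base_only), "samples after dropping random_joining")
```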
mlenjoyneer/RuTextSegWiki
[ "annotations_creators:machine-generated", "language_creators:found", "multilinguality:monolingual", "size_categories:10K<n<100K", "source_datasets:original", "language:ru", "license:unknown", "region:us" ]
2023-09-03T15:07:05+00:00
{"annotations_creators": ["machine-generated"], "language_creators": ["found"], "language": ["ru"], "license": ["unknown"], "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "source_datasets": ["original"]}
2023-09-03T18:19:45+00:00
[]
[ "ru" ]
TAGS #annotations_creators-machine-generated #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Russian #license-unknown #region-us
Dataset Card ============ Table of Contents ----------------- * Table of Contents * Dataset Description + Dataset Summary + Supported Tasks and Leaderboards + Languages * Dataset Structure + Data Instances + Data Fields + Data Splits * Additional Information + Licensing Information + Citation Information Dataset Description ------------------- ### Dataset Summary Dataset for automatic text segmentation of Russian Wikipedia. The text corpus is based on the May 2023 Wikipedia dump. Markup was generated automatically using two methods: taking texts with an existing division into paragraphs, and randomly joining parts of different texts. ### Supported Tasks and Leaderboards The dataset is designed for the text segmentation task. ### Languages The dataset is in Russian. ### Usage ### Other datasets mlenjoyneer/RuTextSegNews - a similar dataset based on a news corpus Dataset Structure ----------------- ### Data Instances For each instance, there is a list of strings with the text sentences, a list of ints with the labels (1 marks the start of a new topic, 0 the continuation of the previous topic), and a string with the sample generation method (base or random\_joining). ### Data Splits Additional Information ---------------------- ### Licensing Information In progress
[ "### Dataset Summary\n\n\nDataset for automatic text segmentation of Russian wiki. Text corpora based on May 2023 Wikipedia dump. Markup was generated automatically based on 2 methods: taking texts with ready division into paragraphs and random joining parts of different texts.", "### Supported Tasks and Leaderboards\n\n\nDataset designed for text segmentation task.", "### Languages\n\n\nThe dataset is in Russian.", "### Usage", "### Other datasets\n\n\nmlenjoyneer/RuTextSegNews - similar dataset based on news corpora\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nFor each instance, there is a list of strings for text sentences, a list of ints for labels (1 is new topic starting and 0 is previous topic continuation) and a string for sample generation method (base or random\\_joining).", "### Data Splits\n\n\n\nAdditional Information\n----------------------", "### Licensing Information\n\n\nIn progress" ]
[ "TAGS\n#annotations_creators-machine-generated #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Russian #license-unknown #region-us \n", "### Dataset Summary\n\n\nDataset for automatic text segmentation of Russian wiki. Text corpora based on May 2023 Wikipedia dump. Markup was generated automatically based on 2 methods: taking texts with ready division into paragraphs and random joining parts of different texts.", "### Supported Tasks and Leaderboards\n\n\nDataset designed for text segmentation task.", "### Languages\n\n\nThe dataset is in Russian.", "### Usage", "### Other datasets\n\n\nmlenjoyneer/RuTextSegNews - similar dataset based on news corpora\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nFor each instance, there is a list of strings for text sentences, a list of ints for labels (1 is new topic starting and 0 is previous topic continuation) and a string for sample generation method (base or random\\_joining).", "### Data Splits\n\n\n\nAdditional Information\n----------------------", "### Licensing Information\n\n\nIn progress" ]
[ 67, 57, 19, 11, 4, 32, 59, 12, 8 ]
[ "passage: TAGS\n#annotations_creators-machine-generated #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Russian #license-unknown #region-us \n### Dataset Summary\n\n\nDataset for automatic text segmentation of Russian wiki. Text corpora based on May 2023 Wikipedia dump. Markup was generated automatically based on 2 methods: taking texts with ready division into paragraphs and random joining parts of different texts.### Supported Tasks and Leaderboards\n\n\nDataset designed for text segmentation task.### Languages\n\n\nThe dataset is in Russian.### Usage### Other datasets\n\n\nmlenjoyneer/RuTextSegNews - similar dataset based on news corpora\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nFor each instance, there is a list of strings for text sentences, a list of ints for labels (1 is new topic starting and 0 is previous topic continuation) and a string for sample generation method (base or random\\_joining).### Data Splits\n\n\n\nAdditional Information\n----------------------### Licensing Information\n\n\nIn progress" ]
bbdfed476c7cd2004403d318dbb464b78bfe0de7
- `TinyStories-{train,validation}.txt` from [roneneldan/TinyStories](https://huggingface.co/datasets/roneneldan/TinyStories)
- ad-hoc Unicode -> ASCII normalization
- remove empty/incomplete stories
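The exact ad-hoc normalization rules are not documented; the sketch below shows one common NFKD-based approximation of a Unicode -> ASCII pass, not the procedure actually used here:

```python
# A minimal sketch of Unicode -> ASCII normalization: decompose accented
# characters, then drop anything outside the ASCII range. This is only an
# approximation of the ad-hoc pass described above.
import unicodedata

def to_ascii(text: str) -> str:
    decomposed = unicodedata.normalize("NFKD", text)
    return decomposed.encode("ascii", "ignore").decode("ascii")

print(to_ascii("“naïve café”"))  # -> 'naive cafe' (curly quotes dropped)
```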
cyrilzhang/TinyStories-ascii
[ "license:cdla-sharing-1.0", "region:us" ]
2023-09-03T15:20:56+00:00
{"license": "cdla-sharing-1.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1885104075.974957, "num_examples": 2105240}, {"name": "validation", "num_bytes": 19045764.092269212, "num_examples": 21839}], "download_size": 985544567, "dataset_size": 1904149840.0672262}}
2023-09-03T18:17:11+00:00
[]
[]
TAGS #license-cdla-sharing-1.0 #region-us
- 'TinyStories-{train,validation}.txt' from roneneldan/TinyStories - ad-hoc Unicode -> ASCII normalization - remove empty/incomplete stories
[]
[ "TAGS\n#license-cdla-sharing-1.0 #region-us \n" ]
[ 17 ]
[ "passage: TAGS\n#license-cdla-sharing-1.0 #region-us \n" ]
eec46cef1415d7fb803f866bf5cc77da39e961fc
# Dataset Card for "TinyStories2-ascii" - `TinyStoriesV2-GPT4-{train,validation}.txt` from [roneneldan/TinyStories](https://huggingface.co/datasets/roneneldan/TinyStories) - ad-hoc Unicode -> ASCII normalization - remove empty/incomplete stories
cyrilzhang/TinyStories2-ascii
[ "license:cdla-sharing-1.0", "region:us" ]
2023-09-03T15:28:13+00:00
{"license": "cdla-sharing-1.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2188147629.6032934, "num_examples": 2708167}, {"name": "validation", "num_bytes": 22108524.398827404, "num_examples": 27544}], "download_size": 1143073704, "dataset_size": 2210256154.002121}}
2023-09-27T12:05:51+00:00
[]
[]
TAGS #license-cdla-sharing-1.0 #region-us
# Dataset Card for "TinyStories2-ascii" - 'TinyStoriesV2-GPT4-{train,validation}.txt' from roneneldan/TinyStories - ad-hoc Unicode -> ASCII normalization - remove empty/incomplete stories
[ "# Dataset Card for \"TinyStories2-ascii\"\n\n- 'TinyStoriesV2-GPT4-{train,validation}.txt' from roneneldan/TinyStories\n- ad-hoc Unicode -> ASCII normalization\n- remove empty/incomplete stories" ]
[ "TAGS\n#license-cdla-sharing-1.0 #region-us \n", "# Dataset Card for \"TinyStories2-ascii\"\n\n- 'TinyStoriesV2-GPT4-{train,validation}.txt' from roneneldan/TinyStories\n- ad-hoc Unicode -> ASCII normalization\n- remove empty/incomplete stories" ]
[ 17, 68 ]
[ "passage: TAGS\n#license-cdla-sharing-1.0 #region-us \n# Dataset Card for \"TinyStories2-ascii\"\n\n- 'TinyStoriesV2-GPT4-{train,validation}.txt' from roneneldan/TinyStories\n- ad-hoc Unicode -> ASCII normalization\n- remove empty/incomplete stories" ]
619a5359393c54416b0504d5df7b074392e7a16a
# Dataset Card for "alan" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tanguyrenaudie/alan
[ "region:us" ]
2023-09-03T16:28:04+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2440375.0, "num_examples": 12}], "download_size": 2441920, "dataset_size": 2440375.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-03T16:28:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for "alan" More Information needed
[ "# Dataset Card for \"alan\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"alan\"\n\nMore Information needed" ]
[ 6, 11 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"alan\"\n\nMore Information needed" ]
37d95c8f50a78786b9e19f2295057f368fdbc96a
# Dataset of Amawashi Aoi

This is the dataset of Amawashi Aoi, containing 188 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 188 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 429 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 188 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 188 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 188 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 188 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 188 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 429 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 429 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 429 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/amawashi_aoi_birdiewinggolfgirlsstory
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-03T16:35:45+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:36+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Amawashi Aoi ======================= This is the dataset of Amawashi Aoi, containing 188 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
d902f1d85176b73703973cb81980de110e692b67
# Dataset Card for "mltest" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tea90210/mltest
[ "region:us" ]
2023-09-03T16:48:55+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 205326, "num_examples": 100}], "download_size": 115128, "dataset_size": 205326}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-04T16:38:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mltest" More Information needed
[ "# Dataset Card for \"mltest\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mltest\"\n\nMore Information needed" ]
[ 6, 12 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mltest\"\n\nMore Information needed" ]
0679317eb63611d906332844e7caa618b15382f4
# Dataset of Shinjo Amane

This is the dataset of Shinjo Amane, containing 200 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 430 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 430 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 430 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 430 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/shinjo_amane_birdiewinggolfgirlsstory
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-03T16:51:53+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:38+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Shinjo Amane ======================= This is the dataset of Shinjo Amane, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
09e7796f13f2f81b6218bc6845def5c7d45c0916
# Dataset Card for Evaluation run of quantumaikr/quantumairk-llama-2-70B-instruct

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/quantumairk-llama-2-70B-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [quantumaikr/quantumairk-llama-2-70B-instruct](https://huggingface.co/quantumaikr/quantumairk-llama-2-70B-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-03T17:56:31.707465](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct/blob/main/results_2023-09-03T17%3A56%3A31.707465.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7034684893089378, "acc_stderr": 0.03095672218595075, "acc_norm": 0.7074134061674743, "acc_norm_stderr": 0.03092655497572306, "mc1": 0.3818849449204406, "mc1_stderr": 0.017008101939163495, "mc2": 0.5442099278190564, "mc2_stderr": 0.014507128903598229 }, "harness|arc:challenge|25": { "acc": 0.6621160409556314, "acc_stderr": 0.013822047922283504, "acc_norm": 0.7030716723549488, "acc_norm_stderr": 0.013352025976725225 }, "harness|hellaswag|10": { "acc": 0.6787492531368253, "acc_stderr": 0.004660025270817022, "acc_norm": 0.8705437163911571, "acc_norm_stderr": 0.003350181812941604 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.042039210401562783, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.042039210401562783 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7960526315789473, "acc_stderr": 0.0327900040631005, "acc_norm": 0.7960526315789473, "acc_norm_stderr": 0.0327900040631005 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7584905660377359, "acc_stderr": 0.026341480371118366, "acc_norm": 0.7584905660377359, "acc_norm_stderr": 0.026341480371118366 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8402777777777778, "acc_stderr": 0.030635578972093274, "acc_norm": 0.8402777777777778, "acc_norm_stderr": 0.030635578972093274 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6936416184971098, "acc_stderr": 0.03514942551267438, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.03514942551267438 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006717, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006717 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.041633319989322626, "acc_norm": 0.78, "acc_norm_stderr": 0.041633319989322626 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6638297872340425, "acc_stderr": 0.030881618520676942, "acc_norm": 0.6638297872340425, "acc_norm_stderr": 0.030881618520676942 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.46296296296296297, "acc_stderr": 0.025680564640056882, "acc_norm": 0.46296296296296297, "acc_norm_stderr": 0.025680564640056882 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5396825396825397, "acc_stderr": 0.04458029125470973, 
"acc_norm": 0.5396825396825397, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8258064516129032, "acc_stderr": 0.02157624818451459, "acc_norm": 0.8258064516129032, "acc_norm_stderr": 0.02157624818451459 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5665024630541872, "acc_stderr": 0.034867317274198714, "acc_norm": 0.5665024630541872, "acc_norm_stderr": 0.034867317274198714 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8787878787878788, "acc_stderr": 0.02548549837334323, "acc_norm": 0.8787878787878788, "acc_norm_stderr": 0.02548549837334323 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8636363636363636, "acc_stderr": 0.024450155973189835, "acc_norm": 0.8636363636363636, "acc_norm_stderr": 0.024450155973189835 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9481865284974094, "acc_stderr": 0.01599622932024412, "acc_norm": 0.9481865284974094, "acc_norm_stderr": 0.01599622932024412 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7205128205128205, "acc_stderr": 0.022752388839776823, "acc_norm": 0.7205128205128205, "acc_norm_stderr": 0.022752388839776823 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114986, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114986 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7773109243697479, "acc_stderr": 0.02702543349888238, "acc_norm": 0.7773109243697479, "acc_norm_stderr": 0.02702543349888238 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5099337748344371, "acc_stderr": 0.04081677107248437, "acc_norm": 0.5099337748344371, "acc_norm_stderr": 0.04081677107248437 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9009174311926605, "acc_stderr": 0.01280978008187893, "acc_norm": 0.9009174311926605, "acc_norm_stderr": 0.01280978008187893 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.625, "acc_stderr": 0.033016908987210894, "acc_norm": 0.625, "acc_norm_stderr": 0.033016908987210894 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9215686274509803, "acc_stderr": 0.018869514646658928, "acc_norm": 0.9215686274509803, "acc_norm_stderr": 0.018869514646658928 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9029535864978903, "acc_stderr": 0.019269323025640255, "acc_norm": 0.9029535864978903, "acc_norm_stderr": 0.019269323025640255 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8026905829596412, "acc_stderr": 0.02670985334496796, "acc_norm": 0.8026905829596412, "acc_norm_stderr": 0.02670985334496796 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752596, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752596 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8842975206611571, "acc_stderr": 0.029199802455622804, "acc_norm": 0.8842975206611571, "acc_norm_stderr": 0.029199802455622804 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.036028141763926456, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 
0.036028141763926456 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8159509202453987, "acc_stderr": 0.030446777687971726, "acc_norm": 0.8159509202453987, "acc_norm_stderr": 0.030446777687971726 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6160714285714286, "acc_stderr": 0.04616143075028546, "acc_norm": 0.6160714285714286, "acc_norm_stderr": 0.04616143075028546 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.034926064766237906, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.034926064766237906 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9188034188034188, "acc_stderr": 0.017893784904018533, "acc_norm": 0.9188034188034188, "acc_norm_stderr": 0.017893784904018533 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8710089399744572, "acc_stderr": 0.01198637154808687, "acc_norm": 0.8710089399744572, "acc_norm_stderr": 0.01198637154808687 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7687861271676301, "acc_stderr": 0.022698657167855713, "acc_norm": 0.7687861271676301, "acc_norm_stderr": 0.022698657167855713 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5988826815642458, "acc_stderr": 0.016392221899407075, "acc_norm": 0.5988826815642458, "acc_norm_stderr": 0.016392221899407075 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7647058823529411, "acc_stderr": 0.024288619466046112, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.024288619466046112 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7684887459807074, "acc_stderr": 0.023956532766639133, "acc_norm": 0.7684887459807074, "acc_norm_stderr": 0.023956532766639133 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8302469135802469, "acc_stderr": 0.020888690414093868, "acc_norm": 0.8302469135802469, "acc_norm_stderr": 0.020888690414093868 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5886524822695035, "acc_stderr": 0.02935491115994097, "acc_norm": 0.5886524822695035, "acc_norm_stderr": 0.02935491115994097 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5840938722294654, "acc_stderr": 0.0125883238503136, "acc_norm": 0.5840938722294654, "acc_norm_stderr": 0.0125883238503136 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7132352941176471, "acc_stderr": 0.027472274473233818, "acc_norm": 0.7132352941176471, "acc_norm_stderr": 0.027472274473233818 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7532679738562091, "acc_stderr": 0.017440820367402507, "acc_norm": 0.7532679738562091, "acc_norm_stderr": 0.017440820367402507 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04265792110940589, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7673469387755102, "acc_stderr": 0.027049257915896175, "acc_norm": 0.7673469387755102, "acc_norm_stderr": 0.027049257915896175 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8706467661691543, "acc_stderr": 0.023729830881018515, "acc_norm": 0.8706467661691543, "acc_norm_stderr": 0.023729830881018515 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 
0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160886, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160886 }, "harness|truthfulqa:mc|0": { "mc1": 0.3818849449204406, "mc1_stderr": 0.017008101939163495, "mc2": 0.5442099278190564, "mc2_stderr": 0.014507128903598229 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
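As noted above, the aggregated numbers live in the separate "results" configuration. A minimal sketch for loading them, assuming it follows the same split-naming scheme as the per-task configurations:

```python
# A minimal sketch: load the aggregated "results" configuration; per the
# card, the "train" split always points to the latest results.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct",
    "results",
    split="train",
)
print(results[0])
```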
open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct
[ "region:us" ]
2023-09-03T16:56:56+00:00
{"pretty_name": "Evaluation run of quantumaikr/quantumairk-llama-2-70B-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [quantumaikr/quantumairk-llama-2-70B-instruct](https://huggingface.co/quantumaikr/quantumairk-llama-2-70B-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-03T17:56:31.707465](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct/blob/main/results_2023-09-03T17%3A56%3A31.707465.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7034684893089378,\n \"acc_stderr\": 0.03095672218595075,\n \"acc_norm\": 0.7074134061674743,\n \"acc_norm_stderr\": 0.03092655497572306,\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5442099278190564,\n \"mc2_stderr\": 0.014507128903598229\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283504,\n \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725225\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6787492531368253,\n \"acc_stderr\": 0.004660025270817022,\n \"acc_norm\": 0.8705437163911571,\n \"acc_norm_stderr\": 0.003350181812941604\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7584905660377359,\n \"acc_stderr\": 0.026341480371118366,\n \"acc_norm\": 0.7584905660377359,\n \"acc_norm_stderr\": 0.026341480371118366\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.030635578972093274\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.025680564640056882,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.025680564640056882\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n \"acc_stderr\": 0.02157624818451459,\n \"acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.02157624818451459\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02548549837334323,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02548549837334323\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n \"acc_norm\": 0.9481865284974094,\n 
\"acc_norm_stderr\": 0.01599622932024412\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776823,\n \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776823\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.02702543349888238,\n \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.02702543349888238\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752596,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752596\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622804,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622804\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8710089399744572,\n \"acc_stderr\": 0.01198637154808687,\n \"acc_norm\": 0.8710089399744572,\n \"acc_norm_stderr\": 0.01198637154808687\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.022698657167855713,\n \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.022698657167855713\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5988826815642458,\n \"acc_stderr\": 0.016392221899407075,\n \"acc_norm\": 0.5988826815642458,\n \"acc_norm_stderr\": 0.016392221899407075\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046112,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046112\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.020888690414093868,\n \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.020888690414093868\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5886524822695035,\n \"acc_stderr\": 0.02935491115994097,\n \"acc_norm\": 0.5886524822695035,\n \"acc_norm_stderr\": 0.02935491115994097\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5840938722294654,\n \"acc_stderr\": 0.0125883238503136,\n \"acc_norm\": 0.5840938722294654,\n \"acc_norm_stderr\": 0.0125883238503136\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233818,\n \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233818\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7532679738562091,\n \"acc_stderr\": 0.017440820367402507,\n \"acc_norm\": 0.7532679738562091,\n \"acc_norm_stderr\": 0.017440820367402507\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.027049257915896175,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.027049257915896175\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018515,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018515\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160886,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160886\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5442099278190564,\n \"mc2_stderr\": 0.014507128903598229\n }\n}\n```", "repo_url": "https://huggingface.co/quantumaikr/quantumairk-llama-2-70B-instruct", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|arc:challenge|25_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hellaswag|10_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T17:56:31.707465.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T17:56:31.707465.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T17:56:31.707465.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T17:56:31.707465.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T17:56:31.707465.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T17_56_31.707465", "path": ["results_2023-09-03T17:56:31.707465.parquet"]}, {"split": "latest", "path": ["results_2023-09-03T17:56:31.707465.parquet"]}]}]}
2023-09-03T16:57:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of quantumaikr/quantumairk-llama-2-70B-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model quantumaikr/quantumairk-llama-2-70B-instruct on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-03T17:56:31.707465 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
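A minimal sketch of that loading step, assuming the details dataset follows the standard `open-llm-leaderboard/details_<org>__<model>` naming and using one of the config names from the metadata above:

```python
from datasets import load_dataset

# Repo id assumed from the usual open-llm-leaderboard naming convention;
# "harness_hendrycksTest_world_religions_5" is one of the listed configs.
data = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct",
    "harness_hendrycksTest_world_religions_5",
    split="train",
)
```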
[ "# Dataset Card for Evaluation run of quantumaikr/quantumairk-llama-2-70B-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/quantumairk-llama-2-70B-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-03T17:56:31.707465(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of quantumaikr/quantumairk-llama-2-70B-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/quantumairk-llama-2-70B-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-03T17:56:31.707465(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of quantumaikr/quantumairk-llama-2-70B-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/quantumairk-llama-2-70B-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-03T17:56:31.707465(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
105547b7a0209d7b422d87c98861176c23759924
# NER Fine-Tuning

We use Flair for fine-tuning NER models on [HIPE-2022](https://github.com/hipe-eval/HIPE-2022-data) datasets from the [HIPE-2022 Shared Task](https://hipe-eval.github.io/HIPE-2022/).

All models are fine-tuned on A10 (24GB) and A100 (40GB) instances from [Lambda Cloud](https://lambdalabs.com/service/gpu-cloud) using Flair:

```bash
$ git clone https://github.com/flairNLP/flair.git
$ cd flair && git checkout 419f13a05d6b36b2a42dd73a551dc3ba679f820c
$ pip3 install -e .
$ cd ..
```

Clone this repo for fine-tuning NER models:

```bash
$ git clone https://github.com/stefan-it/hmTEAMS.git
$ cd hmTEAMS/bench
```

Authorize via the Hugging Face CLI (needed because hmTEAMS is currently only available after approval):

```bash
# Use access token from https://huggingface.co/settings/tokens
$ huggingface-cli login
```

We use a config-driven hyper-parameter search. The script [`flair-fine-tuner.py`](flair-fine-tuner.py) can be used to fine-tune NER models from our Model Zoo.

# Benchmark

We test our pretrained language models on various datasets from HIPE-2020, HIPE-2022 and Europeana. The following table shows an overview of the datasets used.

| Language | Datasets                                           |
|----------|----------------------------------------------------|
| English  | [AjMC] - [TopRes19th]                              |
| German   | [AjMC] - [NewsEye]                                 |
| French   | [AjMC] - [ICDAR-Europeana] - [LeTemps] - [NewsEye] |
| Finnish  | [NewsEye]                                          |
| Swedish  | [NewsEye]                                          |
| Dutch    | [ICDAR-Europeana]                                  |

[AjMC]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-ajmc.md
[NewsEye]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-newseye.md
[TopRes19th]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-topres19th.md
[ICDAR-Europeana]: https://github.com/stefan-it/historic-domain-adaptation-icdar
[LeTemps]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-letemps.md

# Results

We report the averaged F1-score over 5 runs with different seeds on the development set:

| Model | English AjMC | German AjMC | French AjMC | German NewsEye | French NewsEye | Finnish NewsEye | Swedish NewsEye | Dutch ICDAR | French ICDAR | French LeTemps | English TopRes19th | Avg. |
|---------------------------------------------------------------------------|--------------|--------------|--------------|----------------|----------------|-----------------|-----------------|--------------|--------------|----------------|--------------------|-----------|
| hmBERT (32k) [Schweter et al.](https://ceur-ws.org/Vol-3180/paper-87.pdf) | 85.36 ± 0.94 | 89.08 ± 0.09 | 85.10 ± 0.60 | 39.65 ± 1.01   | 81.47 ± 0.36   | 77.28 ± 0.37    | 82.85 ± 0.83    | 82.11 ± 0.61 | 77.21 ± 0.16 | 65.73 ± 0.56   | 80.94 ± 0.86       | 76.98     |
| hmTEAMS (Ours)                                                            | 86.41 ± 0.36 | 88.64 ± 0.42 | 85.41 ± 0.67 | 41.51 ± 2.82   | 83.20 ± 0.79   | 79.27 ± 1.88    | 82.78 ± 0.60    | 88.21 ± 0.39 | 78.03 ± 0.39 | 66.71 ± 0.46   | 81.36 ± 0.59       | **78.32** |
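For orientation, here is a minimal plain-Flair fine-tuning sketch (without the config-driven runner). It assumes Flair's `NER_HIPE_2022` corpus loader with `dataset_name`/`language` arguments, and uses a placeholder backbone id where the gated hmTEAMS checkpoint would go:

```python
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Load one HIPE-2022 configuration; dataset_name/language values follow
# the dataset documentation linked above (here: English AjMC).
corpus = NER_HIPE_2022(dataset_name="ajmc", language="en")
label_dict = corpus.make_label_dictionary(label_type="ner")

# "your-org/your-hmteams-checkpoint" is a placeholder: the real hmTEAMS
# model id is only available after approval.
embeddings = TransformerWordEmbeddings(
    model="your-org/your-hmteams-checkpoint",
    fine_tune=True,
)

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "resources/taggers/ajmc-en",
    learning_rate=5e-5,
    mini_batch_size=16,
    max_epochs=10,
)
```

The actual experiments sweep such hyper-parameters via the configs consumed by `flair-fine-tuner.py`.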
stefan-it/autotrain-flair-hipe2022-de-hmbert
[ "region:us" ]
2023-09-03T17:11:57+00:00
{}
2023-09-03T23:08:49+00:00
[]
[]
TAGS #region-us
NER Fine-Tuning =============== We use Flair for fine-tuning NER models on HIPE-2022 datasets from the HIPE-2022 Shared Task. All models are fine-tuned on A10 (24GB) and A100 (40GB) instances from Lambda Cloud using Flair: Clone this repo for fine-tuning NER models: Authorize via the Hugging Face CLI (needed because hmTEAMS is currently only available after approval): We use a config-driven hyper-parameter search. The script 'URL' can be used to fine-tune NER models from our Model Zoo. Benchmark ========= We test our pretrained language models on various datasets from HIPE-2020, HIPE-2022 and Europeana. The following table shows an overview of the datasets used. Results ======= We report the averaged F1-score over 5 runs with different seeds on the development set:
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
1bc0cd7e19a5930a77ac841a2d721a27ada5073e
# Dataset Card for LeetCode Preference

## Dataset Description

- **Homepage:** 
- **Repository:** 
- **Paper:** 
- **Leaderboard:** 
- **Point of Contact:** 

### Dataset Summary

This dataset facilitates experiments utilizing Direct Preference Optimization (DPO) as outlined in the paper titled [Direct Preference Optimization: Your Language Model is Secretly a Reward Model](https://arxiv.org/abs/2305.18290). This repository provides code pairings crafted by CodeLLaMA-7b. For every LeetCode question posed, CodeLLaMA-7b produces two unique solutions. These are subsequently evaluated and ranked by human experts based on their accuracy, efficiency, and readability.

### Usage

```python
from datasets import load_dataset

dataset = load_dataset("minfeng-ai/leetcode_preference")
```

### Data Fields

Each row of the dataset contains the following fields:

* id: A distinct identifier assigned to each LeetCode question.
* title: The official title of the LeetCode question.
* description: An in-depth prompt offering detailed insights into the respective question.
* difficulty: Categorized into three tiers, indicating the complexity of the question - Easy, Medium, and Hard.
* version1: The initial AI-generated code snippet pertaining to the question.
* version2: A secondary AI-generated code snippet related to the same question.
* preference: A human-assessed label indicating the preferred code snippet between Version1 and Version2.
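Since DPO trainers usually expect (prompt, chosen, rejected) triples, a small mapping sketch is shown below. It assumes the `preference` field stores the winning key as a string (`"version1"` or `"version2"`) and that a `train` split exists:

```python
from datasets import load_dataset

dataset = load_dataset("minfeng-ai/leetcode_preference", split="train")

def to_dpo(example):
    # Assumption: `preference` names the human-preferred snippet directly;
    # if it is stored differently (e.g. as an index), adjust the lookup.
    chosen_key = example["preference"]
    rejected_key = "version2" if chosen_key == "version1" else "version1"
    return {
        "prompt": example["description"],
        "chosen": example[chosen_key],
        "rejected": example[rejected_key],
    }

# Keep only the three DPO columns.
dpo_dataset = dataset.map(to_dpo, remove_columns=dataset.column_names)
print(dpo_dataset[0]["prompt"][:200])
```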
minfeng-ai/leetcode_preference
[ "arxiv:2305.18290", "region:us" ]
2023-09-03T17:25:25+00:00
{}
2023-09-06T00:08:25+00:00
[ "2305.18290" ]
[]
TAGS #arxiv-2305.18290 #region-us
# Dataset Card for LeetCode Preference ## Dataset Description - Homepage: - Repository: - Paper: - Leaderboard: - Point of Contact: ### Dataset Summary This dataset facilitates experiments utilizing Direct Preference Optimization (DPO) as outlined in the paper titled Direct Preference Optimization: Your Language Model is Secretly a Reward Model. This repository provides code pairings crafted by CodeLLaMA-7b. For every LeetCode question posed, CodeLLaMA-7b produces two unique solutions. These are subsequently evaluated and ranked by human experts based on their accuracy, efficiency, and readability. ### Usage ### Data Fields Each row of the dataset contains the following fields: * id: A distinct identifier assigned to each LeetCode question. * title: The official title of the LeetCode question. * description: An in-depth prompt offering detailed insights into the respective question. * difficulty: Categorized into three tiers, indicating the complexity of the question - Easy, Medium, and Hard. * version1: The initial AI-generated code snippet pertaining to the question. * version2: A secondary AI-generated code snippet related to the same question. * preference: A human-assessed label indicating the preferred code snippet between Version1 and Version2.
[ "# Dataset Card for LeetCode Preference", "## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\nThis dataset facilitates experiments utilizing Direct Preference Optimization (DPO) as outlined in the paper titled Direct Preference Optimization: Your Language Model is Secretly a Reward Model. This repository provides code pairings crafted by CodeLLaMA-7b. For every LeetCode question posed, CodeLLaMA-7b produces two unique solutions. These are subsequently evaluated and ranked by human experts based on their accuracy, efficiency, and readability.", "### Usage", "### Data Fields\n\nEach row of the dataset contains the following fields:\n* id: A distinct identifier assigned to each LeetCode question.\n* title: The official title of the LeetCode question.\n* description: An in-depth prompt offering detailed insights into the respective question.\n* difficulty: Categorized into three tiers, indicating the complexity of the question - Easy, Medium, and Hard.\n* version1: The initial AI-generated code snippet pertaining to the question.\n* version2: A secondary AI-generated code snippet related to the same question.\n* preference: A human-assessed label indicating the preferred code snippet between Version1 and Version2." ]
[ "TAGS\n#arxiv-2305.18290 #region-us \n", "# Dataset Card for LeetCode Preference", "## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\nThis dataset facilitates experiments utilizing Direct Preference Optimization (DPO) as outlined in the paper titled Direct Preference Optimization: Your Language Model is Secretly a Reward Model. This repository provides code pairings crafted by CodeLLaMA-7b. For every LeetCode question posed, CodeLLaMA-7b produces two unique solutions. These are subsequently evaluated and ranked by human experts based on their accuracy, efficiency, and readability.", "### Usage", "### Data Fields\n\nEach row of the dataset contains the following fields:\n* id: A distinct identifier assigned to each LeetCode question.\n* title: The official title of the LeetCode question.\n* description: An in-depth prompt offering detailed insights into the respective question.\n* difficulty: Categorized into three tiers, indicating the complexity of the question - Easy, Medium, and Hard.\n* version1: The initial AI-generated code snippet pertaining to the question.\n* version2: A secondary AI-generated code snippet related to the same question.\n* preference: A human-assessed label indicating the preferred code snippet between Version1 and Version2." ]
[ 15, 10, 24, 113, 4, 159 ]
[ "passage: TAGS\n#arxiv-2305.18290 #region-us \n# Dataset Card for LeetCode Preference## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\nThis dataset facilitates experiments utilizing Direct Preference Optimization (DPO) as outlined in the paper titled Direct Preference Optimization: Your Language Model is Secretly a Reward Model. This repository provides code pairings crafted by CodeLLaMA-7b. For every LeetCode question posed, CodeLLaMA-7b produces two unique solutions. These are subsequently evaluated and ranked by human experts based on their accuracy, efficiency, and readability.### Usage### Data Fields\n\nEach row of the dataset contains the following fields:\n* id: A distinct identifier assigned to each LeetCode question.\n* title: The official title of the LeetCode question.\n* description: An in-depth prompt offering detailed insights into the respective question.\n* difficulty: Categorized into three tiers, indicating the complexity of the question - Easy, Medium, and Hard.\n* version1: The initial AI-generated code snippet pertaining to the question.\n* version2: A secondary AI-generated code snippet related to the same question.\n* preference: A human-assessed label indicating the preferred code snippet between Version1 and Version2." ]
f0647407a5a2d5c0c47ba8ef454de3c93ff120cd
# Dataset of Lily Lipman

This is the dataset of Lily Lipman, containing 200 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name        | Images | Download                             | Description                                                               |
|:------------|-------:|:-------------------------------------|:--------------------------------------------------------------------------|
| raw         | 200    | [Download](dataset-raw.zip)          | Raw data with meta information.                                           |
| raw-stage3  | 478    | [Download](dataset-raw-stage3.zip)   | 3-stage cropped raw data with meta information.                           |
| 384x512     | 200    | [Download](dataset-384x512.zip)      | 384x512 aligned dataset.                                                  |
| 512x512     | 200    | [Download](dataset-512x512.zip)      | 512x512 aligned dataset.                                                  |
| 512x704     | 200    | [Download](dataset-512x704.zip)      | 512x704 aligned dataset.                                                  |
| 640x640     | 200    | [Download](dataset-640x640.zip)      | 640x640 aligned dataset.                                                  |
| 640x880     | 200    | [Download](dataset-640x880.zip)      | 640x880 aligned dataset.                                                  |
| stage3-640  | 478    | [Download](dataset-stage3-640.zip)   | 3-stage cropped dataset with the shorter side not exceeding 640 pixels.   |
| stage3-800  | 478    | [Download](dataset-stage3-800.zip)   | 3-stage cropped dataset with the shorter side not exceeding 800 pixels.   |
| stage3-1200 | 478    | [Download](dataset-stage3-1200.zip)  | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels.  |
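One way to fetch and unpack a single variant is via `huggingface_hub`; a sketch, assuming the zip files live at the repository root as the relative links in the table suggest:

```python
import zipfile

from huggingface_hub import hf_hub_download

# Download one aligned archive from the dataset repository and unpack it.
path = hf_hub_download(
    repo_id="CyberHarem/lily_lipman_birdiewinggolfgirlsstory",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(path) as archive:
    archive.extractall("lily_lipman_512x512")
```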
CyberHarem/lily_lipman_birdiewinggolfgirlsstory
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-03T17:31:55+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:40+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Lily Lipman ====================== This is the dataset of Lily Lipman, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
2cbf0db201220d33310eefc0c60eefb809eba0ad
# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2](https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T16:41:24.154084](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2/blob/main/results_2023-09-17T16-41-24.154084.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.272126677852349,
        "em_stderr": 0.004557777416899833,
        "f1": 0.34851929530201453,
        "f1_stderr": 0.004500795514577557,
        "acc": 0.36653625926220684,
        "acc_stderr": 0.007090302750388251
    },
    "harness|drop|3": {
        "em": 0.272126677852349,
        "em_stderr": 0.004557777416899833,
        "f1": 0.34851929530201453,
        "f1_stderr": 0.004500795514577557
    },
    "harness|gsm8k|5": {
        "acc": 0.0037907505686125853,
        "acc_stderr": 0.0016927007401502038
    },
    "harness|winogrande|5": {
        "acc": 0.7292817679558011,
        "acc_stderr": 0.012487904760626299
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2
[ "region:us" ]
2023-09-03T17:33:42+00:00
{"pretty_name": "Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2](https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T16:41:24.154084](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2/blob/main/results_2023-09-17T16-41-24.154084.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.272126677852349,\n \"em_stderr\": 0.004557777416899833,\n \"f1\": 0.34851929530201453,\n \"f1_stderr\": 0.004500795514577557,\n \"acc\": 0.36653625926220684,\n \"acc_stderr\": 0.007090302750388251\n },\n \"harness|drop|3\": {\n \"em\": 0.272126677852349,\n \"em_stderr\": 0.004557777416899833,\n \"f1\": 0.34851929530201453,\n \"f1_stderr\": 0.004500795514577557\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401502038\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626299\n }\n}\n```", "repo_url": "https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|arc:challenge|25_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T16_41_24.154084", "path": ["**/details_harness|drop|3_2023-09-17T16-41-24.154084.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T16-41-24.154084.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T16_41_24.154084", "path": ["**/details_harness|gsm8k|5_2023-09-17T16-41-24.154084.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T16-41-24.154084.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": 
"2023_09_03T18_33_19.019825", "path": ["**/details_harness|hellaswag|10_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:33:19.019825.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:33:19.019825.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:33:19.019825.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T18:33:19.019825.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T18:33:19.019825.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T18:33:19.019825.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T16_41_24.154084", "path": ["**/details_harness|winogrande|5_2023-09-17T16-41-24.154084.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T16-41-24.154084.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T18_33_19.019825", "path": ["results_2023-09-03T18:33:19.019825.parquet"]}, {"split": "2023_09_17T16_41_24.154084", "path": ["results_2023-09-17T16-41-24.154084.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T16-41-24.154084.parquet"]}]}]}
2023-09-17T15:41:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2 on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the reconstructed snippet after this card):

## Latest results

These are the latest results from run 2023-09-17T16:41:24.154084 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
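The fenced snippet referenced by "do the following" above is dropped in this plain-text rendering; reconstructed from the repo metadata earlier in this record, the intended call is:

```python
from datasets import load_dataset

# "train" always points at the latest run's results for this config.
data = load_dataset(
    "open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2",
    "harness_winogrande_5",
    split="train",
)
```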
[ "# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T16:41:24.154084(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T16:41:24.154084(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 50, 31, 198, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T16:41:24.154084(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
806171d9e861fa24d21b793bb12b1086097d70b4
# Bangumi Image Base of Hirogaru Sky! Precure

This is the image base of bangumi Hirogaru Sky! Precure; we detected 47 characters and 6541 images in total. The full dataset is [here](all.zip).

**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).

Here is the characters' preview:

| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1340 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) |
| 1 | 616 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) |
| 2 | 12 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) |
| 3 | 26 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) |
| 4 | 12 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) |
| 5 | 62 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) |
| 6 | 32 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) |
| 7 | 15 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) |
| 8 | 40 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) |
| 9 | 56 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) |
| 10 | 35 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) |
| 11 | 16 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) |
| 12 | 10 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) |
| 13 | 29 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) |
| 14 | 891 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) |
| 15 | 372 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) |
| 16 | 29 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) |
| 17 | 8 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) |
| 18 | 411 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) |
| 19 | 30 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) |
| 20 | 469 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) |
| 21 | 13 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) |
| 22 | 63 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) |
| 23 | 11 | [Download](23/dataset.zip) | ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) |
| 24 | 14 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) |
| 25 | 505 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) |
| 26 | 346 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) |
| 27 | 20 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) |
| 28 | 13 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) |
| 29 | 6 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | ![preview 5](29/preview_5.png) | ![preview 6](29/preview_6.png) | N/A | N/A |
| 30 | 13 | [Download](30/dataset.zip) | ![preview 1](30/preview_1.png) | ![preview 2](30/preview_2.png) | ![preview 3](30/preview_3.png) | ![preview 4](30/preview_4.png) | ![preview 5](30/preview_5.png) | ![preview 6](30/preview_6.png) | ![preview 7](30/preview_7.png) | ![preview 8](30/preview_8.png) |
| 31 | 14 | [Download](31/dataset.zip) | ![preview 1](31/preview_1.png) | ![preview 2](31/preview_2.png) | ![preview 3](31/preview_3.png) | ![preview 4](31/preview_4.png) | ![preview 5](31/preview_5.png) | ![preview 6](31/preview_6.png) | ![preview 7](31/preview_7.png) | ![preview 8](31/preview_8.png) |
| 32 | 33 | [Download](32/dataset.zip) | ![preview 1](32/preview_1.png) | ![preview 2](32/preview_2.png) | ![preview 3](32/preview_3.png) | ![preview 4](32/preview_4.png) | ![preview 5](32/preview_5.png) | ![preview 6](32/preview_6.png) | ![preview 7](32/preview_7.png) | ![preview 8](32/preview_8.png) |
| 33 | 255 | [Download](33/dataset.zip) | ![preview 1](33/preview_1.png) | ![preview 2](33/preview_2.png) | ![preview 3](33/preview_3.png) | ![preview 4](33/preview_4.png) | ![preview 5](33/preview_5.png) | ![preview 6](33/preview_6.png) | ![preview 7](33/preview_7.png) | ![preview 8](33/preview_8.png) |
| 34 | 6 | [Download](34/dataset.zip) | ![preview 1](34/preview_1.png) | ![preview 2](34/preview_2.png) | ![preview 3](34/preview_3.png) | ![preview 4](34/preview_4.png) | ![preview 5](34/preview_5.png) | ![preview 6](34/preview_6.png) | N/A | N/A |
| 35 | 7 | [Download](35/dataset.zip) | ![preview 1](35/preview_1.png) | ![preview 2](35/preview_2.png) | ![preview 3](35/preview_3.png) | ![preview 4](35/preview_4.png) | ![preview 5](35/preview_5.png) | ![preview 6](35/preview_6.png) | ![preview 7](35/preview_7.png) | N/A |
| 36 | 9 | [Download](36/dataset.zip) | ![preview 1](36/preview_1.png) | ![preview 2](36/preview_2.png) | ![preview 3](36/preview_3.png) | ![preview 4](36/preview_4.png) | ![preview 5](36/preview_5.png) | ![preview 6](36/preview_6.png) | ![preview 7](36/preview_7.png) | ![preview 8](36/preview_8.png) |
| 37 | 121 | [Download](37/dataset.zip) | ![preview 1](37/preview_1.png) | ![preview 2](37/preview_2.png) | ![preview 3](37/preview_3.png) | ![preview 4](37/preview_4.png) | ![preview 5](37/preview_5.png) | ![preview 6](37/preview_6.png) | ![preview 7](37/preview_7.png) | ![preview 8](37/preview_8.png) |
| 38 | 13 | [Download](38/dataset.zip) | ![preview 1](38/preview_1.png) | ![preview 2](38/preview_2.png) | ![preview 3](38/preview_3.png) | ![preview 4](38/preview_4.png) | ![preview 5](38/preview_5.png) | ![preview 6](38/preview_6.png) | ![preview 7](38/preview_7.png) | ![preview 8](38/preview_8.png) |
| 39 | 9 | [Download](39/dataset.zip) | ![preview 1](39/preview_1.png) | ![preview 2](39/preview_2.png) | ![preview 3](39/preview_3.png) | ![preview 4](39/preview_4.png) | ![preview 5](39/preview_5.png) | ![preview 6](39/preview_6.png) | ![preview 7](39/preview_7.png) | ![preview 8](39/preview_8.png) |
| 40 | 138 | [Download](40/dataset.zip) | ![preview 1](40/preview_1.png) | ![preview 2](40/preview_2.png) | ![preview 3](40/preview_3.png) | ![preview 4](40/preview_4.png) | ![preview 5](40/preview_5.png) | ![preview 6](40/preview_6.png) | ![preview 7](40/preview_7.png) | ![preview 8](40/preview_8.png) |
| 41 | 6 | [Download](41/dataset.zip) | ![preview 1](41/preview_1.png) | ![preview 2](41/preview_2.png) | ![preview 3](41/preview_3.png) | ![preview 4](41/preview_4.png) | ![preview 5](41/preview_5.png) | ![preview 6](41/preview_6.png) | N/A | N/A |
| 42 | 8 | [Download](42/dataset.zip) | ![preview 1](42/preview_1.png) | ![preview 2](42/preview_2.png) | ![preview 3](42/preview_3.png) | ![preview 4](42/preview_4.png) | ![preview 5](42/preview_5.png) | ![preview 6](42/preview_6.png) | ![preview 7](42/preview_7.png) | ![preview 8](42/preview_8.png) |
| 43 | 51 | [Download](43/dataset.zip) | ![preview 1](43/preview_1.png) | ![preview 2](43/preview_2.png) | ![preview 3](43/preview_3.png) | ![preview 4](43/preview_4.png) | ![preview 5](43/preview_5.png) | ![preview 6](43/preview_6.png) | ![preview 7](43/preview_7.png) | ![preview 8](43/preview_8.png) |
| 44 | 69 | [Download](44/dataset.zip) | ![preview 1](44/preview_1.png) | ![preview 2](44/preview_2.png) | ![preview 3](44/preview_3.png) | ![preview 4](44/preview_4.png) | ![preview 5](44/preview_5.png) | ![preview 6](44/preview_6.png) | ![preview 7](44/preview_7.png) | ![preview 8](44/preview_8.png) |
| 45 | 40 | [Download](45/dataset.zip) | ![preview 1](45/preview_1.png) | ![preview 2](45/preview_2.png) | ![preview 3](45/preview_3.png) | ![preview 4](45/preview_4.png) | ![preview 5](45/preview_5.png) | ![preview 6](45/preview_6.png) | ![preview 7](45/preview_7.png) | ![preview 8](45/preview_8.png) |
| noise | 247 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
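The Download links above resolve to zip archives stored inside this dataset repository (repo id BangumiBase/hirogaruskyprecure, per the record metadata below). A minimal, illustrative sketch for fetching and unpacking one character's archive with huggingface_hub follows; the output directory name is an arbitrary choice:

```python
import zipfile

from huggingface_hub import hf_hub_download

# Download character 0's archive from the dataset repo; hf_hub_download
# returns the local cache path of the fetched file.
archive = hf_hub_download(
    repo_id="BangumiBase/hirogaruskyprecure",
    filename="0/dataset.zip",
    repo_type="dataset",
)

# Unpack the images into a local folder for inspection or training.
with zipfile.ZipFile(archive) as zf:
    zf.extractall("hirogaru_sky_character_0")
```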
BangumiBase/hirogaruskyprecure
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
2023-09-03T17:37:56+00:00
{"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]}
2023-09-29T04:42:06+00:00
[]
[]
TAGS #size_categories-1K<n<10K #license-mit #art #region-us
Bangumi Image Base of Hirogaru Sky! Precure
===========================================

This is the image base of bangumi Hirogaru Sky! Precure; we detected 47 characters and 6541 images in total. The full dataset is here.

Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy. If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).

Here is the characters' preview:
[]
[ "TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
[ 25 ]
[ "passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
2e13a75b4c63c45ce30f6583a8f15405bf051419
# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2](https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-03T18:41:04.280567](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2/blob/main/results_2023-09-03T18%3A41%3A04.280567.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.5159772470651705,
        "acc_stderr": 0.03490050368845693,
        "acc_norm": 0.5196198874675843,
        "acc_norm_stderr": 0.03488383911166199,
        "mc1": 0.3574051407588739,
        "mc1_stderr": 0.0167765996767294,
        "mc2": 0.5084843623108531,
        "mc2_stderr": 0.015788699144390992
    },
    "harness|arc:challenge|25": {
        "acc": 0.5537542662116041,
        "acc_stderr": 0.014526705548539982,
        "acc_norm": 0.5810580204778157,
        "acc_norm_stderr": 0.014418106953639013
    },
    "harness|hellaswag|10": {
        "acc": 0.6132244572794264,
        "acc_stderr": 0.004860162076330978,
        "acc_norm": 0.8008364867556264,
        "acc_norm_stderr": 0.0039855506403304606
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.28,
        "acc_stderr": 0.04512608598542128,
        "acc_norm": 0.28,
        "acc_norm_stderr": 0.04512608598542128
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.48148148148148145,
        "acc_stderr": 0.043163785995113245,
        "acc_norm": 0.48148148148148145,
        "acc_norm_stderr": 0.043163785995113245
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.47368421052631576,
        "acc_stderr": 0.04063302731486671,
        "acc_norm": 0.47368421052631576,
        "acc_norm_stderr": 0.04063302731486671
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.5,
        "acc_stderr": 0.050251890762960605,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.050251890762960605
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.6,
        "acc_stderr": 0.030151134457776285,
        "acc_norm": 0.6,
        "acc_norm_stderr": 0.030151134457776285
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.5625,
        "acc_stderr": 0.04148415739394154,
        "acc_norm": 0.5625,
        "acc_norm_stderr": 0.04148415739394154
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.38,
        "acc_stderr": 0.04878317312145632,
        "acc_norm": 0.38,
        "acc_norm_stderr": 0.04878317312145632
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.36,
        "acc_stderr": 0.048241815132442176,
        "acc_norm": 0.36,
        "acc_norm_stderr": 0.048241815132442176
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.28,
        "acc_stderr": 0.04512608598542128,
        "acc_norm": 0.28,
        "acc_norm_stderr": 0.04512608598542128
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.4797687861271676,
        "acc_stderr": 0.03809342081273957,
        "acc_norm": 0.4797687861271676,
        "acc_norm_stderr": 0.03809342081273957
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.29411764705882354,
        "acc_stderr": 0.04533838195929775,
        "acc_norm": 0.29411764705882354,
        "acc_norm_stderr": 0.04533838195929775
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.62,
        "acc_stderr": 0.048783173121456316,
        "acc_norm": 0.62,
        "acc_norm_stderr": 0.048783173121456316
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.4851063829787234,
        "acc_stderr": 0.032671518489247764,
        "acc_norm": 0.4851063829787234,
        "acc_norm_stderr": 0.032671518489247764
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.32456140350877194,
        "acc_stderr": 0.044045561573747664,
        "acc_norm": 0.32456140350877194,
        "acc_norm_stderr": 0.044045561573747664
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.45517241379310347,
        "acc_stderr": 0.04149886942192117,
        "acc_norm": 0.45517241379310347,
        "acc_norm_stderr": 0.04149886942192117
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.30952380952380953,
        "acc_stderr": 0.023809523809523867,
        "acc_norm": 0.30952380952380953,
        "acc_norm_stderr": 0.023809523809523867
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.2777777777777778,
        "acc_stderr": 0.040061680838488774,
        "acc_norm": 0.2777777777777778,
        "acc_norm_stderr": 0.040061680838488774
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.37,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.5612903225806452,
        "acc_stderr": 0.028229497320317216,
        "acc_norm": 0.5612903225806452,
        "acc_norm_stderr": 0.028229497320317216
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.3842364532019704,
        "acc_stderr": 0.0342239856565755,
        "acc_norm": 0.3842364532019704,
        "acc_norm_stderr": 0.0342239856565755
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.41,
        "acc_stderr": 0.04943110704237102,
        "acc_norm": 0.41,
        "acc_norm_stderr": 0.04943110704237102
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.7090909090909091,
        "acc_stderr": 0.03546563019624336,
        "acc_norm": 0.7090909090909091,
        "acc_norm_stderr": 0.03546563019624336
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.6818181818181818,
        "acc_stderr": 0.0331847733384533,
        "acc_norm": 0.6818181818181818,
        "acc_norm_stderr": 0.0331847733384533
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.7512953367875648,
        "acc_stderr": 0.031195840877700286,
        "acc_norm": 0.7512953367875648,
        "acc_norm_stderr": 0.031195840877700286
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.48205128205128206,
        "acc_stderr": 0.02533466708095495,
        "acc_norm": 0.48205128205128206,
        "acc_norm_stderr": 0.02533466708095495
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.2518518518518518,
        "acc_stderr": 0.02646611753895991,
        "acc_norm": 0.2518518518518518,
        "acc_norm_stderr": 0.02646611753895991
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.5126050420168067,
        "acc_stderr": 0.03246816765752174,
        "acc_norm": 0.5126050420168067,
        "acc_norm_stderr": 0.03246816765752174
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.3576158940397351,
        "acc_stderr": 0.03913453431177258,
        "acc_norm": 0.3576158940397351,
        "acc_norm_stderr": 0.03913453431177258
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.7192660550458716,
        "acc_stderr": 0.019266055045871623,
        "acc_norm": 0.7192660550458716,
        "acc_norm_stderr": 0.019266055045871623
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.375,
        "acc_stderr": 0.033016908987210894,
        "acc_norm": 0.375,
        "acc_norm_stderr": 0.033016908987210894
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.696078431372549,
        "acc_stderr": 0.03228210387037892,
        "acc_norm": 0.696078431372549,
        "acc_norm_stderr": 0.03228210387037892
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.7215189873417721,
        "acc_stderr": 0.029178682304842548,
        "acc_norm": 0.7215189873417721,
        "acc_norm_stderr": 0.029178682304842548
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.5964125560538116,
        "acc_stderr": 0.03292802819330314,
        "acc_norm": 0.5964125560538116,
        "acc_norm_stderr": 0.03292802819330314
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.5954198473282443,
        "acc_stderr": 0.043046937953806645,
        "acc_norm": 0.5954198473282443,
        "acc_norm_stderr": 0.043046937953806645
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.6446280991735537,
        "acc_stderr": 0.0436923632657398,
        "acc_norm": 0.6446280991735537,
        "acc_norm_stderr": 0.0436923632657398
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.6481481481481481,
        "acc_stderr": 0.046166311118017125,
        "acc_norm": 0.6481481481481481,
        "acc_norm_stderr": 0.046166311118017125
    },
"harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5705521472392638, "acc_stderr": 0.03889066619112723, "acc_norm": 0.5705521472392638, "acc_norm_stderr": 0.03889066619112723 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.04498676320572924, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.04498676320572924 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7777777777777778, "acc_stderr": 0.027236013946196704, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.027236013946196704 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7100893997445722, "acc_stderr": 0.01622501794477098, "acc_norm": 0.7100893997445722, "acc_norm_stderr": 0.01622501794477098 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5809248554913294, "acc_stderr": 0.02656417811142262, "acc_norm": 0.5809248554913294, "acc_norm_stderr": 0.02656417811142262 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.264804469273743, "acc_stderr": 0.014756906483260664, "acc_norm": 0.264804469273743, "acc_norm_stderr": 0.014756906483260664 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5261437908496732, "acc_stderr": 0.028590752958852394, "acc_norm": 0.5261437908496732, "acc_norm_stderr": 0.028590752958852394 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5884244372990354, "acc_stderr": 0.027950481494401266, "acc_norm": 0.5884244372990354, "acc_norm_stderr": 0.027950481494401266 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5648148148148148, "acc_stderr": 0.027586006221607708, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.027586006221607708 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.375886524822695, "acc_stderr": 0.028893955412115882, "acc_norm": 0.375886524822695, "acc_norm_stderr": 0.028893955412115882 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.38396349413298564, "acc_stderr": 0.01242158783313423, "acc_norm": 0.38396349413298564, "acc_norm_stderr": 0.01242158783313423 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4889705882352941, "acc_stderr": 0.03036544647727568, "acc_norm": 0.4889705882352941, "acc_norm_stderr": 0.03036544647727568 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4869281045751634, "acc_stderr": 0.020220920829626912, "acc_norm": 0.4869281045751634, "acc_norm_stderr": 0.020220920829626912 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5909090909090909, "acc_stderr": 0.04709306978661896, "acc_norm": 0.5909090909090909, "acc_norm_stderr": 0.04709306978661896 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6122448979591837, "acc_stderr": 0.031192230726795656, "acc_norm": 0.6122448979591837, "acc_norm_stderr": 0.031192230726795656 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6119402985074627, "acc_stderr": 0.034457899643627506, "acc_norm": 0.6119402985074627, "acc_norm_stderr": 0.034457899643627506 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-virology|5": { "acc": 0.40963855421686746, "acc_stderr": 0.03828401115079022, "acc_norm": 0.40963855421686746, "acc_norm_stderr": 0.03828401115079022 }, 
"harness|hendrycksTest-world_religions|5": { "acc": 0.695906432748538, "acc_stderr": 0.03528211258245229, "acc_norm": 0.695906432748538, "acc_norm_stderr": 0.03528211258245229 }, "harness|truthfulqa:mc|0": { "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.5084843623108531, "mc2_stderr": 0.015788699144390992 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2
[ "region:us" ]
2023-09-03T17:41:28+00:00
{"pretty_name": "Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2](https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-03T18:41:04.280567](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2/blob/main/results_2023-09-03T18%3A41%3A04.280567.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5159772470651705,\n \"acc_stderr\": 0.03490050368845693,\n \"acc_norm\": 0.5196198874675843,\n \"acc_norm_stderr\": 0.03488383911166199,\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5084843623108531,\n \"mc2_stderr\": 0.015788699144390992\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5537542662116041,\n \"acc_stderr\": 0.014526705548539982,\n \"acc_norm\": 0.5810580204778157,\n \"acc_norm_stderr\": 0.014418106953639013\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6132244572794264,\n \"acc_stderr\": 0.004860162076330978,\n \"acc_norm\": 0.8008364867556264,\n \"acc_norm_stderr\": 0.0039855506403304606\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 
0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523867,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523867\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n \"acc_stderr\": 0.028229497320317216,\n \"acc_norm\": 0.5612903225806452,\n \"acc_norm_stderr\": 0.028229497320317216\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700286,\n \"acc_norm\": 
0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700286\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.02533466708095495,\n \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.02533466708095495\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871623,\n \"acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871623\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112723,\n \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112723\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.027236013946196704,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.027236013946196704\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n 
\"acc_stderr\": 0.01622501794477098,\n \"acc_norm\": 0.7100893997445722,\n \"acc_norm_stderr\": 0.01622501794477098\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5809248554913294,\n \"acc_stderr\": 0.02656417811142262,\n \"acc_norm\": 0.5809248554913294,\n \"acc_norm_stderr\": 0.02656417811142262\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.014756906483260664,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.014756906483260664\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852394,\n \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852394\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n \"acc_stderr\": 0.027950481494401266,\n \"acc_norm\": 0.5884244372990354,\n \"acc_norm_stderr\": 0.027950481494401266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607708,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607708\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.38396349413298564,\n \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.03036544647727568,\n \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.03036544647727568\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626912,\n \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626912\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 0.6119402985074627,\n \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245229,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245229\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5084843623108531,\n \"mc2_stderr\": 0.015788699144390992\n }\n}\n```", "repo_url": "https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": 
"[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|arc:challenge|25_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hellaswag|10_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:41:04.280567.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:41:04.280567.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:41:04.280567.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T18:41:04.280567.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T18:41:04.280567.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T18_41_04.280567", "path": ["results_2023-09-03T18:41:04.280567.parquet"]}, {"split": "latest", "path": ["results_2023-09-03T18:41:04.280567.parquet"]}]}]}
2023-09-03T17:42:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2023-09-03T18:41:04.280567 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
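The loading snippet referenced above was stripped from this copy of the card. Below is a minimal sketch in the spirit of the other cards in this dump; the repo id is an assumption, inferred from the leaderboard's `details_<org>__<model>` naming pattern, while the config name and split name are taken from this record's metadata:

```python
from datasets import load_dataset

# Assumed repo id, following the naming pattern of the other evaluation
# datasets in this dump (not verified against the Hub):
data = load_dataset(
    "open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2",
    "harness_truthfulqa_mc_0",  # config name listed in this record's metadata
    split="latest",             # split name listed in this record's metadata
)
```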
[ "# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-03T18:41:04.280567(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-03T18:41:04.280567(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 50, 31, 198, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-03T18:41:04.280567(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f00af4de1c30a6f13846415b1621fd21e184a4af
# Dataset of Saotome Ichina This is the dataset of Saotome Ichina, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). | Name | Images | Download | Description | |:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------| | raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. | | raw-stage3 | 485 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. | | 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. | | 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. | | 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. | | 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. | | 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. | | stage3-640 | 485 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. | | stage3-800 | 485 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. | | stage3-1200 | 485 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
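The download links in the table are relative to this dataset repository. One way to fetch an archive programmatically is sketched below using `huggingface_hub`; the repo id and archive name come from this card, everything else is a plain library call:

```python
from huggingface_hub import hf_hub_download

# Download one of the packaged archives listed in the table above.
path = hf_hub_download(
    repo_id="CyberHarem/saotome_ichina_birdiewinggolfgirlsstory",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(path)  # local path of the downloaded zip
```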
CyberHarem/saotome_ichina_birdiewinggolfgirlsstory
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-03T17:49:47+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:42+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Saotome Ichina
=========================

This is the dataset of Saotome Ichina, containing 200 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
fce6e40410751ee85e4f5b627679a2df034f3591
# Dataset of Himekawa Mizuho This is the dataset of Himekawa Mizuho, containing 74 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). | Name | Images | Download | Description | |:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------| | raw | 74 | [Download](dataset-raw.zip) | Raw data with meta information. | | raw-stage3 | 154 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. | | 384x512 | 74 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. | | 512x512 | 74 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. | | 512x704 | 74 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. | | 640x640 | 74 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. | | 640x880 | 74 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. | | stage3-640 | 154 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. | | stage3-800 | 154 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. | | stage3-1200 | 154 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
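The same download approach works for this dataset; the sketch below additionally unpacks the raw archive with the standard library (repo id and archive name come from this card, the output directory name is arbitrary):

```python
import zipfile

from huggingface_hub import hf_hub_download

# Fetch the raw package and extract it into a local folder.
archive = hf_hub_download(
    repo_id="CyberHarem/himekawa_mizuho_birdiewinggolfgirlsstory",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(archive) as zf:
    zf.extractall("himekawa_mizuho_raw")
    print(f"extracted {len(zf.namelist())} files")
```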
CyberHarem/himekawa_mizuho_birdiewinggolfgirlsstory
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-03T17:58:45+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:45+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Himekawa Mizuho
==========================

This is the dataset of Himekawa Mizuho, containing 74 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
7c40051babd2501b40479ae9c13f251fc92085c9
git lfs install
git clone https://huggingface.co/redstonehero/epicphotogasm_v1
solomars/solo123
[ "region:us" ]
2023-09-03T18:04:14+00:00
{}
2023-09-03T18:08:41+00:00
[]
[]
TAGS #region-us
git lfs install
git clone URL
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
be863472c17e657d83c6e5a922d5ea89bb6cd70c
# Dataset Card for Evaluation run of KnutJaegersberg/LLongMA-3b-LIMA

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/KnutJaegersberg/LLongMA-3b-LIMA
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [KnutJaegersberg/LLongMA-3b-LIMA](https://huggingface.co/KnutJaegersberg/LLongMA-3b-LIMA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T12:59:36.364632](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA/blob/main/results_2023-10-27T12-59-36.364632.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.00027736144573357115,
        "f1": 0.04566589765100663,
        "f1_stderr": 0.0012269345796283918,
        "acc": 0.3184065922558586,
        "acc_stderr": 0.007527358968906723
    },
    "harness|drop|3": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.00027736144573357115,
        "f1": 0.04566589765100663,
        "f1_stderr": 0.0012269345796283918
    },
    "harness|gsm8k|5": {
        "acc": 0.003032600454890068,
        "acc_stderr": 0.00151457356122455
    },
    "harness|winogrande|5": {
        "acc": 0.6337805840568271,
        "acc_stderr": 0.013540144376588896
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA
[ "region:us" ]
2023-09-03T19:10:16+00:00
{"pretty_name": "Evaluation run of KnutJaegersberg/LLongMA-3b-LIMA", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/LLongMA-3b-LIMA](https://huggingface.co/KnutJaegersberg/LLongMA-3b-LIMA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T12:59:36.364632](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA/blob/main/results_2023-10-27T12-59-36.364632.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573357115,\n \"f1\": 0.04566589765100663,\n \"f1_stderr\": 0.0012269345796283918,\n \"acc\": 0.3184065922558586,\n \"acc_stderr\": 0.007527358968906723\n },\n \"harness|drop|3\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573357115,\n \"f1\": 0.04566589765100663,\n \"f1_stderr\": 0.0012269345796283918\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.00151457356122455\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6337805840568271,\n \"acc_stderr\": 0.013540144376588896\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/LLongMA-3b-LIMA", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|arc:challenge|25_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T12_59_36.364632", "path": ["**/details_harness|drop|3_2023-10-27T12-59-36.364632.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T12-59-36.364632.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T12_59_36.364632", "path": ["**/details_harness|gsm8k|5_2023-10-27T12-59-36.364632.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T12-59-36.364632.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hellaswag|10_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T20:09:53.352642.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T20:09:53.352642.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T20:09:53.352642.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T20:09:53.352642.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T20:09:53.352642.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T12_59_36.364632", "path": ["**/details_harness|winogrande|5_2023-10-27T12-59-36.364632.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T12-59-36.364632.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T20_09_53.352642", "path": ["results_2023-09-03T20:09:53.352642.parquet"]}, {"split": "2023_10_27T12_59_36.364632", "path": ["results_2023-10-27T12-59-36.364632.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T12-59-36.364632.parquet"]}]}]}
2023-10-27T11:59:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KnutJaegersberg/LLongMA-3b-LIMA ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model KnutJaegersberg/LLongMA-3b-LIMA on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-27T12:59:36.364632 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of KnutJaegersberg/LLongMA-3b-LIMA", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/LLongMA-3b-LIMA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T12:59:36.364632(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KnutJaegersberg/LLongMA-3b-LIMA", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/LLongMA-3b-LIMA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T12:59:36.364632(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/LLongMA-3b-LIMA## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/LLongMA-3b-LIMA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T12:59:36.364632(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e3b8e02d8fb2f032ccab13271443a0f9f6f06f1d
Source: https://adatbazis.k-monitor.hu/

## Usage

```python
from datasets import load_dataset

dataset = load_dataset('K-Monitor/kmdb_base')

# print the title of an article
print(dataset['train'][0]['title'])
```

## Column descriptions

`news_id`: the article's K-Monitor identifier

`title`: the title given by the news portal (may be incomplete, see: kmonitor_title)

`source_url`: the article's URL, pointing to the news portal

`kmdb_url`: points to the article's record in the K-Monitor database (https://adatbazis.k-monitor.hu/hirek/...)

`newspaper`: the name of the news portal (taken from the K-Monitor database)

`pub_time`: time of publication (taken from the K-Monitor database)

`authors`: the article's authors, extracted while parsing the article (may be wrong, often missing)

`text`: the body of the article, cleaned (never empty, but may be incomplete)

`meta_keywords`: news portal tags, also extracted while parsing the article (not always present and not always meaningful)

`meta_description`: the lead given by the news portal (a short intro/summary at the beginning)

`url`: the article's URL, normalized (http->https, www stripped, other URL parameters removed [except for a few news portals])

`kmonitor_title`: the same article's title, as stored in the K-Monitor database

`publish_date`: the article's publication date, with varying precision and in rare cases nonsensical

`kmonitor_description`: the description stored in the K-Monitor database (often `null`, sometimes identical to the regular description)

`kmonitor_places`: locations tagged by K-Monitor volunteers

`kmonitor_persons`: persons tagged by K-Monitor volunteers

`kmonitor_others`: other keywords tagged by K-Monitor volunteers (the given word does not necessarily appear in the article)
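As a further illustration, here is a minimal sketch that relies only on the columns documented above and the standard `datasets` API (the aggregation shown is an assumption about typical usage, not part of the original card):

```python
from collections import Counter

from datasets import load_dataset

dataset = load_dataset('K-Monitor/kmdb_base')

# keep only the articles that volunteers tagged with at least one person
tagged = dataset['train'].filter(lambda row: bool(row['kmonitor_persons']))

# count the most frequently tagged persons across the filtered articles
counts = Counter(p for row in tagged for p in row['kmonitor_persons'])
print(counts.most_common(5))
```

Because `kmonitor_persons` is assigned by volunteers rather than extracted from the article body, such a filter reflects the curators' tagging, not a full-text search.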
K-Monitor/kmdb_base
[ "size_categories:10K<n<100K", "language:hu", "region:us" ]
2023-09-03T20:10:30+00:00
{"language": ["hu"], "size_categories": ["10K<n<100K"], "pretty_name": "K-Monitor sajt\u00f3adatb\u00e1zis", "dataset_info": {"features": [{"name": "news_id", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "source_url", "dtype": "string"}, {"name": "kmdb_url", "dtype": "string"}, {"name": "newspaper", "dtype": "string"}, {"name": "pub_time", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "text", "dtype": "string"}, {"name": "meta_keywords", "sequence": "string"}, {"name": "meta_description", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "kmonitor_title", "dtype": "string"}, {"name": "publish_date", "dtype": "string"}, {"name": "kmonitor_description", "dtype": "string"}, {"name": "kmonitor_places", "sequence": "string"}, {"name": "kmonitor_persons", "sequence": "string"}, {"name": "kmonitor_others", "sequence": "string"}, {"name": "kmonitor_institutions", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 243123503.81640086, "num_examples": 46914}], "download_size": 140573780, "dataset_size": 243123503.81640086}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-16T16:58:06+00:00
[]
[ "hu" ]
TAGS #size_categories-10K<n<100K #language-Hungarian #region-us
Source: URL

## Usage

## Column descriptions

'news_id': the article's K-Monitor identifier

'title': the title given by the news portal (may be incomplete, see: kmonitor_title)

'source_url': the article's URL, pointing to the news portal

'kmdb_url': points to the article's record in the K-Monitor database (URL

'newspaper': the name of the news portal (taken from the K-Monitor database)

'pub_time': time of publication (taken from the K-Monitor database)

'authors': the article's authors, extracted while parsing the article (may be wrong, often missing)

'text': the body of the article, cleaned (never empty, but may be incomplete)

'meta_keywords': news portal tags, also extracted while parsing the article (not always present and not always meaningful)

'meta_description': the lead given by the news portal (a short intro/summary at the beginning)

'url': the article's URL, normalized (http->https, www stripped, other URL parameters removed [except for a few news portals])

'kmonitor_title': the same article's title, as stored in the K-Monitor database

'publish_date': the article's publication date, with varying precision and in rare cases nonsensical

'kmonitor_description': the description stored in the K-Monitor database (often 'null', sometimes identical to the regular description)

'kmonitor_places': locations tagged by K-Monitor volunteers

'kmonitor_persons': persons tagged by K-Monitor volunteers

'kmonitor_others': other keywords tagged by K-Monitor volunteers (the given word does not necessarily appear in the article)
[ "## Használat", "## Oszlopok magyarázata\n\n'news_id': a cikk k-monitor-os azonosítója\n\n'title': hírportál által adott cím (hiányos lehet, lásd: kmonitor_title)\n\n'source_url': cikk url-je, ami hírportálra mutat\n\n'kmdb_url': a cikk k-monitor adatbázisba felvett rekordjára mutat (URL\n\n'newspaper': a hírportál neve (a k-monitor adatbázisból származik)\n\n'pub_time': publikálás időpontja (a k-monitor adatbázisból származik)\n\n'authors': a cikk szerzői, ez a hír parsolása közben kinyert információ (lehet, hogy hibás, sokszor hiányzik) \n\n'text': cikk szövegének törzse, tisztítva (üres nem lehet, de előfordulhat, hogy nem teljes)\n\n'meta_keywords': hírportál címkék, ez is a hír parsolása közben kinyert információ (nem mindig van és nem mindig értelmes)\n\n'meta_description': hírportál által adott lead (kis bevezető/összefoglaló az elején)\n\n'url': a cikk url-je, normalizálva (http->https, www kiszedve, egyéb url paraméterek eltávolítva [kivétel pár hírportál esetén])\n\n'kmonitor_title': ugyanez címmel\n\n'publish_date': cikk megjelenésének ideje, változó pontossággal és ritkán értelmetlen\n\n'kmonitor_description': k-monitor adatbázisában tárolt leírás (sokszor 'null', néha egyezik a rendes leírással)\n\n'kmonitor_places': K-Monitoros önkéntesek által címkézett helyszínek\n\n'kmonitor_persons': K-Monitoros önkéntesek által címkézett személyek\n\n'kmonitor_others': K-Monitoros önkéntesek által címkézett egyéb kulcsszavak (nem feltétlen jelenik meg a cikkben az adott szó)" ]
[ "TAGS\n#size_categories-10K<n<100K #language-Hungarian #region-us \n", "## Használat", "## Oszlopok magyarázata\n\n'news_id': a cikk k-monitor-os azonosítója\n\n'title': hírportál által adott cím (hiányos lehet, lásd: kmonitor_title)\n\n'source_url': cikk url-je, ami hírportálra mutat\n\n'kmdb_url': a cikk k-monitor adatbázisba felvett rekordjára mutat (URL\n\n'newspaper': a hírportál neve (a k-monitor adatbázisból származik)\n\n'pub_time': publikálás időpontja (a k-monitor adatbázisból származik)\n\n'authors': a cikk szerzői, ez a hír parsolása közben kinyert információ (lehet, hogy hibás, sokszor hiányzik) \n\n'text': cikk szövegének törzse, tisztítva (üres nem lehet, de előfordulhat, hogy nem teljes)\n\n'meta_keywords': hírportál címkék, ez is a hír parsolása közben kinyert információ (nem mindig van és nem mindig értelmes)\n\n'meta_description': hírportál által adott lead (kis bevezető/összefoglaló az elején)\n\n'url': a cikk url-je, normalizálva (http->https, www kiszedve, egyéb url paraméterek eltávolítva [kivétel pár hírportál esetén])\n\n'kmonitor_title': ugyanez címmel\n\n'publish_date': cikk megjelenésének ideje, változó pontossággal és ritkán értelmetlen\n\n'kmonitor_description': k-monitor adatbázisában tárolt leírás (sokszor 'null', néha egyezik a rendes leírással)\n\n'kmonitor_places': K-Monitoros önkéntesek által címkézett helyszínek\n\n'kmonitor_persons': K-Monitoros önkéntesek által címkézett személyek\n\n'kmonitor_others': K-Monitoros önkéntesek által címkézett egyéb kulcsszavak (nem feltétlen jelenik meg a cikkben az adott szó)" ]
[ 24, 3, 428 ]
[ "passage: TAGS\n#size_categories-10K<n<100K #language-Hungarian #region-us \n## Használat## Oszlopok magyarázata\n\n'news_id': a cikk k-monitor-os azonosítója\n\n'title': hírportál által adott cím (hiányos lehet, lásd: kmonitor_title)\n\n'source_url': cikk url-je, ami hírportálra mutat\n\n'kmdb_url': a cikk k-monitor adatbázisba felvett rekordjára mutat (URL\n\n'newspaper': a hírportál neve (a k-monitor adatbázisból származik)\n\n'pub_time': publikálás időpontja (a k-monitor adatbázisból származik)\n\n'authors': a cikk szerzői, ez a hír parsolása közben kinyert információ (lehet, hogy hibás, sokszor hiányzik) \n\n'text': cikk szövegének törzse, tisztítva (üres nem lehet, de előfordulhat, hogy nem teljes)\n\n'meta_keywords': hírportál címkék, ez is a hír parsolása közben kinyert információ (nem mindig van és nem mindig értelmes)\n\n'meta_description': hírportál által adott lead (kis bevezető/összefoglaló az elején)\n\n'url': a cikk url-je, normalizálva (http->https, www kiszedve, egyéb url paraméterek eltávolítva [kivétel pár hírportál esetén])\n\n'kmonitor_title': ugyanez címmel\n\n'publish_date': cikk megjelenésének ideje, változó pontossággal és ritkán értelmetlen\n\n'kmonitor_description': k-monitor adatbázisában tárolt leírás (sokszor 'null', néha egyezik a rendes leírással)\n\n'kmonitor_places': K-Monitoros önkéntesek által címkézett helyszínek\n\n'kmonitor_persons': K-Monitoros önkéntesek által címkézett személyek\n\n'kmonitor_others': K-Monitoros önkéntesek által címkézett egyéb kulcsszavak (nem feltétlen jelenik meg a cikkben az adott szó)" ]
f091e1d3c242e503af6c7fa38dfdb54de2bba4a0
# Dataset Card for "kmdb_classification" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
boapps/kmdb_classification
[ "region:us" ]
2023-09-03T20:11:21+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "keywords", "sequence": "string"}, {"name": "label", "dtype": "int64"}, {"name": "url", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "is_hand_annoted", "dtype": "bool"}, {"name": "score", "dtype": "float64"}, {"name": "title_score", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 187493981, "num_examples": 45683}, {"name": "test", "num_bytes": 13542701, "num_examples": 3605}, {"name": "validation", "num_bytes": 25309037, "num_examples": 6579}], "download_size": 139938458, "dataset_size": 226345719}}
2023-09-21T10:43:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for "kmdb_classification" More Information needed
[ "# Dataset Card for \"kmdb_classification\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"kmdb_classification\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"kmdb_classification\"\n\nMore Information needed" ]
9fe7ce2a839046b956264ec9e6fba1fb09b5c3db
# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [PeanutJar/LLaMa-2-PeanutButter_v18_B-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); see the sketch at the end of this card for loading it.

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T18:47:12.642745](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B/blob/main/results_2023-09-22T18-47-12.642745.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0050335570469798654,
        "em_stderr": 0.0007247385547751907,
        "f1": 0.060973154362416224,
        "f1_stderr": 0.0014562854103949273,
        "acc": 0.40513399869433026,
        "acc_stderr": 0.009524554979348756
    },
    "harness|drop|3": {
        "em": 0.0050335570469798654,
        "em_stderr": 0.0007247385547751907,
        "f1": 0.060973154362416224,
        "f1_stderr": 0.0014562854103949273
    },
    "harness|gsm8k|5": {
        "acc": 0.06520090978013647,
        "acc_stderr": 0.006800302989321091
    },
    "harness|winogrande|5": {
        "acc": 0.745067087608524,
        "acc_stderr": 0.012248806969376422
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
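As referenced in the summary above, here is a minimal sketch for loading the aggregated metrics. It assumes only the "results" configuration and the "latest" split that this repository's configs declare; it is an illustration, not part of the original card:

```python
from datasets import load_dataset

# "results" aggregates every evaluation run; the "latest" split points to
# the most recent results file (2023-09-22 for this repository)
results = load_dataset(
    "open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B",
    "results",
    split="latest",
)

# inspect the first aggregated record
print(results[0])
```

The same pattern should work for any timestamped split named in the configs, by passing that split name instead of "latest".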
open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B
[ "region:us" ]
2023-09-03T21:06:42+00:00
{"pretty_name": "Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [PeanutJar/LLaMa-2-PeanutButter_v18_B-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T18:47:12.642745](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B/blob/main/results_2023-09-22T18-47-12.642745.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0050335570469798654,\n \"em_stderr\": 0.0007247385547751907,\n \"f1\": 0.060973154362416224,\n \"f1_stderr\": 0.0014562854103949273,\n \"acc\": 0.40513399869433026,\n \"acc_stderr\": 0.009524554979348756\n },\n \"harness|drop|3\": {\n \"em\": 0.0050335570469798654,\n \"em_stderr\": 0.0007247385547751907,\n \"f1\": 0.060973154362416224,\n \"f1_stderr\": 0.0014562854103949273\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06520090978013647,\n \"acc_stderr\": 0.006800302989321091\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n }\n}\n```", "repo_url": "https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|arc:challenge|25_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T18_47_12.642745", "path": ["**/details_harness|drop|3_2023-09-22T18-47-12.642745.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T18-47-12.642745.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T18_47_12.642745", "path": ["**/details_harness|gsm8k|5_2023-09-22T18-47-12.642745.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T18-47-12.642745.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hellaswag|10_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T22:06:17.603163.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-03T22:06:17.603163.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T18_47_12.642745", "path": ["**/details_harness|winogrande|5_2023-09-22T18-47-12.642745.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T18-47-12.642745.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_03T22_06_17.603163", "path": ["results_2023-09-03T22:06:17.603163.parquet"]}, {"split": "2023_09_22T18_47_12.642745", "path": ["results_2023-09-22T18-47-12.642745.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T18-47-12.642745.parquet"]}]}]}
2023-09-22T17:47:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model PeanutJar/LLaMa-2-PeanutButter_v18_B-7B on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-09-22T18:47:12.642745 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
[ "# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/LLaMa-2-PeanutButter_v18_B-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T18:47:12.642745(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/LLaMa-2-PeanutButter_v18_B-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T18:47:12.642745(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 31, 31, 179, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/LLaMa-2-PeanutButter_v18_B-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T18:47:12.642745(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c6e95ad2144d13200b04e9c028f35eb4c0f904f1
# Dataset Card for "kirsten_v5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
profetize/kirsten_v5
[ "region:us" ]
2023-09-03T21:09:34+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validate", "path": "data/validate-*"}]}], "dataset_info": {"features": [{"name": "Filename", "dtype": "string"}, {"name": "URL", "dtype": "string"}, {"name": "Content", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 243360307.832761, "num_examples": 10482}, {"name": "test", "num_bytes": 81143319.5836195, "num_examples": 3495}, {"name": "validate", "num_bytes": 81143319.5836195, "num_examples": 3495}], "download_size": 237943569, "dataset_size": 405646947.0}}
2023-09-03T21:09:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for "kirsten_v5" More Information needed
[ "# Dataset Card for \"kirsten_v5\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"kirsten_v5\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"kirsten_v5\"\n\nMore Information needed" ]
c412a8c51c88933d6ce7338fa7bcddfd8c5bca30
# Dataset Card for "OpenOrca-GPT4-standardized" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
HydraLM/OpenOrca-GPT4-standardized
[ "region:us" ]
2023-09-03T21:35:39+00:00
{"dataset_info": {"features": [{"name": "message", "dtype": "string"}, {"name": "message_type", "dtype": "string"}, {"name": "message_id", "dtype": "int64"}, {"name": "conversation_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1856699239, "num_examples": 2984688}], "download_size": 979202725, "dataset_size": 1856699239}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-03T21:40:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for "OpenOrca-GPT4-standardized" More Information needed
[ "# Dataset Card for \"OpenOrca-GPT4-standardized\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"OpenOrca-GPT4-standardized\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"OpenOrca-GPT4-standardized\"\n\nMore Information needed" ]
6491957dbe8664ca42e1f7a422e3b511f6f4c5b7
# Dataset Card for "mr-golem" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
volvoDon/mr-golem
[ "region:us" ]
2023-09-03T23:06:42+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}], "splits": [{"name": "train", "num_bytes": 155724.0, "num_examples": 19}, {"name": "test", "num_bytes": 24588.0, "num_examples": 3}], "download_size": 103142, "dataset_size": 180312.0}}
2023-09-03T23:06:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mr-golem" More Information needed
[ "# Dataset Card for \"mr-golem\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mr-golem\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mr-golem\"\n\nMore Information needed" ]
2b631f02a8836b253e5e8eea89281026ded4ceb5
# Dataset Card for "prm800k_processed_preference" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
miazhao/prm800k_processed_preference
[ "region:us" ]
2023-09-03T23:10:15+00:00
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "responses", "sequence": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23805614, "num_examples": 22036}], "download_size": 9396871, "dataset_size": 23805614}}
2023-09-03T23:10:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for "prm800k_processed_preference" More Information needed
[ "# Dataset Card for \"prm800k_processed_preference\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"prm800k_processed_preference\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"prm800k_processed_preference\"\n\nMore Information needed" ]
d96d64073dcfe99f21df023bec3f0c7a48e19bca
This dataset contains a partial computation of the metrics (memorized token frequencies, non-memorized token frequencies, and sequence frequencies) needed for [research](https://github.com/EleutherAI/semantic-memorization).
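As a quick orientation, here is a minimal sketch for loading one slice of these metrics with the `datasets` library; the split and column names below follow the configuration and feature list in this repository's metadata:

```python
from datasets import load_dataset

# Each split pairs one model size with a dedup setting, e.g. the
# memorized sequences of the deduplicated 70M-parameter model.
memories = load_dataset(
    "alvin319/semantic-memorization-partial-2023-09-03",
    split="memories_deduped_70m",
)

# Every row carries the token ids together with per-token memorized /
# non-memorized frequencies, a memorization score, and the sequence frequency.
row = memories[0]
print(row["memorization_score"], row["sequence_frequency"])
```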
alvin319/semantic-memorization-partial-2023-09-03
[ "license:mit", "region:us" ]
2023-09-04T00:07:54+00:00
{"license": "mit", "configs": [{"config_name": "default", "data_files": [{"split": "pile_deduped_70m", "path": "data/pile_deduped_70m-*"}, {"split": "memories_deduped_70m", "path": "data/memories_deduped_70m-*"}, {"split": "pile_deduped_160m", "path": "data/pile_deduped_160m-*"}, {"split": "memories_deduped_160m", "path": "data/memories_deduped_160m-*"}, {"split": "pile_deduped_410m", "path": "data/pile_deduped_410m-*"}, {"split": "memories_deduped_410m", "path": "data/memories_deduped_410m-*"}, {"split": "pile_deduped_1b", "path": "data/pile_deduped_1b-*"}, {"split": "memories_deduped_1b", "path": "data/memories_deduped_1b-*"}, {"split": "pile_deduped_1.4b", "path": "data/pile_deduped_1.4b-*"}, {"split": "memories_deduped_1.4b", "path": "data/memories_deduped_1.4b-*"}, {"split": "pile_deduped_2.8b", "path": "data/pile_deduped_2.8b-*"}, {"split": "memories_deduped_2.8b", "path": "data/memories_deduped_2.8b-*"}, {"split": "pile_deduped_6.9b", "path": "data/pile_deduped_6.9b-*"}, {"split": "memories_deduped_6.9b", "path": "data/memories_deduped_6.9b-*"}, {"split": "pile_deduped_12b", "path": "data/pile_deduped_12b-*"}, {"split": "memories_deduped_12b", "path": "data/memories_deduped_12b-*"}, {"split": "pile_duped_70m", "path": "data/pile_duped_70m-*"}, {"split": "memories_duped_70m", "path": "data/memories_duped_70m-*"}, {"split": "pile_duped_160m", "path": "data/pile_duped_160m-*"}, {"split": "memories_duped_160m", "path": "data/memories_duped_160m-*"}, {"split": "pile_duped_410m", "path": "data/pile_duped_410m-*"}, {"split": "memories_duped_410m", "path": "data/memories_duped_410m-*"}, {"split": "pile_duped_1b", "path": "data/pile_duped_1b-*"}, {"split": "memories_duped_1b", "path": "data/memories_duped_1b-*"}, {"split": "pile_duped_1.4b", "path": "data/pile_duped_1.4b-*"}, {"split": "memories_duped_1.4b", "path": "data/memories_duped_1.4b-*"}, {"split": "pile_duped_2.8b", "path": "data/pile_duped_2.8b-*"}, {"split": "memories_duped_2.8b", "path": "data/memories_duped_2.8b-*"}, {"split": "pile_duped_6.9b", "path": "data/pile_duped_6.9b-*"}, {"split": "memories_duped_6.9b", "path": "data/memories_duped_6.9b-*"}, {"split": "pile_duped_12b", "path": "data/pile_duped_12b-*"}, {"split": "memories_duped_12b", "path": "data/memories_duped_12b-*"}]}], "dataset_info": {"features": [{"name": "sequence_id", "dtype": "int64"}, {"name": "tokens", "sequence": "int64"}, {"name": "memorized_frequencies", "sequence": "int64"}, {"name": "non_memorized_frequencies", "sequence": "int64"}, {"name": "memorization_score", "dtype": "float64"}, {"name": "sequence_frequency", "dtype": "int64"}], "splits": [{"name": "pile_deduped_70m", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_deduped_70m", "num_bytes": 646796256, "num_examples": 411448}, {"name": "pile_deduped_160m", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_deduped_160m", "num_bytes": 913638540, "num_examples": 581195}, {"name": "pile_deduped_410m", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_deduped_410m", "num_bytes": 1274953308, "num_examples": 811039}, {"name": "pile_deduped_1b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_deduped_1b", "num_bytes": 1623663780, "num_examples": 1032865}, {"name": "pile_deduped_1.4b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_deduped_1.4b", "num_bytes": 1647608484, "num_examples": 1048097}, {"name": "pile_deduped_2.8b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": 
"memories_deduped_2.8b", "num_bytes": 2130391692, "num_examples": 1355211}, {"name": "pile_deduped_6.9b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_deduped_6.9b", "num_bytes": 2641422168, "num_examples": 1680294}, {"name": "pile_deduped_12b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_deduped_12b", "num_bytes": 2941549980, "num_examples": 1871215}, {"name": "pile_duped_70m", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_duped_70m", "num_bytes": 729334116, "num_examples": 463953}, {"name": "pile_duped_160m", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_duped_160m", "num_bytes": 1084165956, "num_examples": 689673}, {"name": "pile_duped_410m", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_duped_410m", "num_bytes": 1525376052, "num_examples": 970341}, {"name": "pile_duped_1b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_duped_1b", "num_bytes": 1974653652, "num_examples": 1256141}, {"name": "pile_duped_1.4b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_duped_1.4b", "num_bytes": 2159490984, "num_examples": 1373722}, {"name": "pile_duped_2.8b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_duped_2.8b", "num_bytes": 2633221044, "num_examples": 1675077}, {"name": "pile_duped_6.9b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_duped_6.9b", "num_bytes": 3334163268, "num_examples": 2120969}, {"name": "pile_duped_12b", "num_bytes": 7860000000, "num_examples": 5000000}, {"name": "memories_duped_12b", "num_bytes": 3745016472, "num_examples": 2382326}], "download_size": 11256676441, "dataset_size": 156765445752}}
2023-09-04T08:39:21+00:00
[]
[]
TAGS #license-mit #region-us
This dataset is a partial computation of metrics (memorized token frequencies, non-memorized token frequencies, sequence frequencies) needed for research.
[]
[ "TAGS\n#license-mit #region-us \n" ]
[ 11 ]
[ "passage: TAGS\n#license-mit #region-us \n" ]
cbba06aea4dd066d19dc0985a930c13f803d3db3
# OpenHermes Dataset

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/XIiSwLP1Uu94IUucGypyl.png)

OpenHermes 13B is the first fine-tune of the Hermes dataset with a fully open-source dataset! OpenHermes was trained on 242,000 entries of primarily GPT-4 generated data, drawn from open datasets across the AI landscape, including:

- GPTeacher - General Instruct, Roleplay v1, Roleplay v2, and Code Instruct Datasets, by Teknium
- WizardLM (v1, evol_instruct 70k), by WizardLM Team/nlpxucan
- Airoboros GPT-4 (v1.0), by JonDurbin
- Camel-AI's domain expert datasets, by the Camel-AI Team
- CodeAlpaca, by Sahil2801
- GPT4-LLM and Unnatural Instructions, by Microsoft

Filtering included the removal of OpenAI refusals, disclaimers, and "As an AI"-style examples, among other cleanups.

The base dataset mix is identical to the original Nous-Hermes', minus the Nous-Instruct and PDACTL datasets, which were private.
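For orientation, a minimal sketch for loading the dataset with the `datasets` library; the default configuration and the `train` split name are assumptions, since the card does not list its configurations explicitly:

```python
from datasets import load_dataset

# Load the OpenHermes instruction data. The split name "train" is an
# assumption; the card does not declare its configurations.
dataset = load_dataset("teknium/openhermes", split="train")

# Inspect one entry to see which fields are available before using it.
print(dataset[0])
```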
teknium/openhermes
[ "task_categories:text-generation", "language:eng", "distillation", "synthetic data", "gpt", "region:us" ]
2023-09-04T00:31:26+00:00
{"language": ["eng"], "task_categories": ["text-generation"], "pretty_name": "OpenHermes-v1.0", "tags": ["distillation", "synthetic data", "gpt"]}
2023-09-07T19:41:05+00:00
[]
[ "eng" ]
TAGS #task_categories-text-generation #language-English #distillation #synthetic data #gpt #region-us
# OpenHermes Dataset !image/png The OpenHermes dataset is composed of 242,000 entries of primarily GPT-4 generated data, from open datasets across the AI landscape, including: OpenHermes 13B is the first fine tune of the Hermes dataset that has a fully open source dataset! OpenHermes was trained on 242,000 entries of primarily GPT-4 generated data, from open datasets across the AI landscape, including: - GPTeacher - General Instruct, Roleplay v1, Roleplay v2, and Code Instruct Datasets, by Teknium - WizardLM (v1, evol_instruct 70k), by WizardLM Team/nlpxucan - Airoboros GPT-4 (v1.0), by JonDurbin - Camel-AI's domain expert datasets, by the Camel-AI Team - CodeAlpaca, by Sahil2801 - GPT4-LLM and Unnatural Instructions, by Microsoft Filtering included removal of OpenAI refusals, disclaimers, and "As an AI" type examples and more The base dataset mix is identical to the original Nous-Hermes', minus the Nous-Instruct and PDACTL datasets which were private datasets.
[ "# OpenHermes Dataset\n\n!image/png\n\nThe OpenHermes dataset is composed of 242,000 entries of primarily GPT-4 generated data, from open datasets across the AI landscape, including:\n\nOpenHermes 13B is the first fine tune of the Hermes dataset that has a fully open source dataset!\n\nOpenHermes was trained on 242,000 entries of primarily GPT-4 generated data, from open datasets across the AI landscape, including:\n\n- GPTeacher - General Instruct, Roleplay v1, Roleplay v2, and Code Instruct Datasets, by Teknium\n- WizardLM (v1, evol_instruct 70k), by WizardLM Team/nlpxucan\n- Airoboros GPT-4 (v1.0), by JonDurbin\n- Camel-AI's domain expert datasets, by the Camel-AI Team\n- CodeAlpaca, by Sahil2801\n- GPT4-LLM and Unnatural Instructions, by Microsoft\n\nFiltering included removal of OpenAI refusals, disclaimers, and \"As an AI\" type examples and more\n\n\nThe base dataset mix is identical to the original Nous-Hermes', minus the Nous-Instruct and PDACTL datasets which were private datasets." ]
[ "TAGS\n#task_categories-text-generation #language-English #distillation #synthetic data #gpt #region-us \n", "# OpenHermes Dataset\n\n!image/png\n\nThe OpenHermes dataset is composed of 242,000 entries of primarily GPT-4 generated data, from open datasets across the AI landscape, including:\n\nOpenHermes 13B is the first fine tune of the Hermes dataset that has a fully open source dataset!\n\nOpenHermes was trained on 242,000 entries of primarily GPT-4 generated data, from open datasets across the AI landscape, including:\n\n- GPTeacher - General Instruct, Roleplay v1, Roleplay v2, and Code Instruct Datasets, by Teknium\n- WizardLM (v1, evol_instruct 70k), by WizardLM Team/nlpxucan\n- Airoboros GPT-4 (v1.0), by JonDurbin\n- Camel-AI's domain expert datasets, by the Camel-AI Team\n- CodeAlpaca, by Sahil2801\n- GPT4-LLM and Unnatural Instructions, by Microsoft\n\nFiltering included removal of OpenAI refusals, disclaimers, and \"As an AI\" type examples and more\n\n\nThe base dataset mix is identical to the original Nous-Hermes', minus the Nous-Instruct and PDACTL datasets which were private datasets." ]
[ 33, 284 ]
[ "passage: TAGS\n#task_categories-text-generation #language-English #distillation #synthetic data #gpt #region-us \n# OpenHermes Dataset\n\n!image/png\n\nThe OpenHermes dataset is composed of 242,000 entries of primarily GPT-4 generated data, from open datasets across the AI landscape, including:\n\nOpenHermes 13B is the first fine tune of the Hermes dataset that has a fully open source dataset!\n\nOpenHermes was trained on 242,000 entries of primarily GPT-4 generated data, from open datasets across the AI landscape, including:\n\n- GPTeacher - General Instruct, Roleplay v1, Roleplay v2, and Code Instruct Datasets, by Teknium\n- WizardLM (v1, evol_instruct 70k), by WizardLM Team/nlpxucan\n- Airoboros GPT-4 (v1.0), by JonDurbin\n- Camel-AI's domain expert datasets, by the Camel-AI Team\n- CodeAlpaca, by Sahil2801\n- GPT4-LLM and Unnatural Instructions, by Microsoft\n\nFiltering included removal of OpenAI refusals, disclaimers, and \"As an AI\" type examples and more\n\n\nThe base dataset mix is identical to the original Nous-Hermes', minus the Nous-Instruct and PDACTL datasets which were private datasets." ]
7eaa9705a23025d381f2ad5fcb9e044adff1b4b5
annotations_creators: []
language:
- en
language_creators:
- other
- found
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: Medical Conversational Dataset
size_categories:
- 1K<n<10K
source_datasets:
- original
tags: []
task_categories:
- conversational
task_ids:
- dialogue-generation
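A minimal sketch for loading this conversational dataset with the `datasets` library, assuming the repository exposes loadable data files and a `train` split (neither is declared in the repository metadata):

```python
from datasets import load_dataset

# Both the split name and the presence of loadable data files are
# assumptions; the repository metadata declares no configuration.
dataset = load_dataset("taaredikahan23/Medical_dataset", split="train")
print(dataset[0])
```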
taaredikahan23/Medical_dataset
[ "region:us" ]
2023-09-04T00:33:49+00:00
{}
2023-09-04T00:35:06+00:00
[]
[]
TAGS #region-us
annotations_creators: [] language: - en language_creators: - other - found license: - apache-2.0 multilinguality: - monolingual pretty_name: Medical Conversational Dataset size_categories: - 1K<n<10K source_datasets: - original tags: [] task_categories: - conversational task_ids: - dialogue-generation
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
99ee60f9798d75b2cc9662124ea07e402fe7699d
# Dataset Card for "medical-llama2-1k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
taaredikahan23/medical-llama2-1k
[ "region:us" ]
2023-09-04T00:45:30+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 394047, "num_examples": 1000}], "download_size": 185327, "dataset_size": 394047}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-04T00:45:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for "medical-llama2-1k" More Information needed
[ "# Dataset Card for \"medical-llama2-1k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"medical-llama2-1k\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"medical-llama2-1k\"\n\nMore Information needed" ]
d4a0f4b44c784ec46b382dc8606a7c3bf79b7047
# Dataset Card for "prm800k_rating_cls" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
miazhao/prm800k_rating_cls
[ "region:us" ]
2023-09-04T00:54:55+00:00
{"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "rating", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 649475535, "num_examples": 801063}], "download_size": 94263081, "dataset_size": 649475535}}
2023-09-04T00:54:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for "prm800k_rating_cls" More Information needed
[ "# Dataset Card for \"prm800k_rating_cls\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"prm800k_rating_cls\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"prm800k_rating_cls\"\n\nMore Information needed" ]
bc8b0f7ac0d8051843f2f8e53b236b9298619f41
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T03:41:50.218488](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj/blob/main/results_2023-10-18T03-41-50.218488.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.35979446308724833,
        "em_stderr": 0.0049150348561349125,
        "f1": 0.39900062919463114,
        "f1_stderr": 0.004825664226319409,
        "acc": 0.46081554692422627,
        "acc_stderr": 0.01059065324860096
    },
    "harness|drop|3": {
        "em": 0.35979446308724833,
        "em_stderr": 0.0049150348561349125,
        "f1": 0.39900062919463114,
        "f1_stderr": 0.004825664226319409
    },
    "harness|gsm8k|5": {
        "acc": 0.14025777103866566,
        "acc_stderr": 0.009565108281428673
    },
    "harness|winogrande|5": {
        "acc": 0.7813733228097869,
        "acc_stderr": 0.011616198215773246
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
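Beyond the per-task configurations, the aggregated metrics shown above can also be loaded directly; a minimal sketch, with the configuration and split names taken from this repository's metadata:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj",
    "results",
    split="latest",
)
print(results[0])
```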
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj
[ "region:us" ]
2023-09-04T01:20:08+00:00
{"pretty_name": "Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj", "dataset_summary": "Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T03:41:50.218488](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj/blob/main/results_2023-10-18T03-41-50.218488.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.35979446308724833,\n \"em_stderr\": 0.0049150348561349125,\n \"f1\": 0.39900062919463114,\n \"f1_stderr\": 0.004825664226319409,\n \"acc\": 0.46081554692422627,\n \"acc_stderr\": 0.01059065324860096\n },\n \"harness|drop|3\": {\n \"em\": 0.35979446308724833,\n \"em_stderr\": 0.0049150348561349125,\n \"f1\": 0.39900062919463114,\n \"f1_stderr\": 0.004825664226319409\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14025777103866566,\n \"acc_stderr\": 0.009565108281428673\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773246\n }\n}\n```", "repo_url": "https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|arc:challenge|25_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T03_41_50.218488", "path": ["**/details_harness|drop|3_2023-10-18T03-41-50.218488.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T03-41-50.218488.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T03_41_50.218488", "path": ["**/details_harness|gsm8k|5_2023-10-18T03-41-50.218488.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T03-41-50.218488.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": 
["**/details_harness|hellaswag|10_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:19:44.261303.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:19:44.261303.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:19:44.261303.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-09-04T02:19:44.261303.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:19:44.261303.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T02:19:44.261303.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T02:19:44.261303.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T03_41_50.218488", "path": ["**/details_harness|winogrande|5_2023-10-18T03-41-50.218488.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T03-41-50.218488.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_04T02_19_44.261303", "path": ["results_2023-09-04T02:19:44.261303.parquet"]}, {"split": "2023_10_18T03_41_50.218488", "path": ["results_2023-10-18T03-41-50.218488.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T03-41-50.218488.parquet"]}]}]}
2023-10-18T02:42:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T03:41:50.218488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T03:41:50.218488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T03:41:50.218488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 43, 31, 191, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T03:41:50.218488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8a721b57d7f6de6a58a35da8ed0c9163d0c6ad8c
# Dataset Card for "big-golem" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
volvoDon/big-golem
[ "region:us" ]
2023-09-04T01:28:28+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}], "splits": [{"name": "train", "num_bytes": 155724.0, "num_examples": 19}, {"name": "test", "num_bytes": 24588.0, "num_examples": 3}], "download_size": 102998, "dataset_size": 180312.0}}
2023-09-04T01:28:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "big-golem" More Information needed
[ "# Dataset Card for \"big-golem\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"big-golem\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"big-golem\"\n\nMore Information needed" ]
df9d077eee4bbe66cece2ccc42961c60833c3c85
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v7

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v7](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-04T02:38:01.038212](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7/blob/main/results_2023-09-04T02%3A38%3A01.038212.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {"acc": 0.6832397060915553, "acc_stderr": 0.031693477754770626, "acc_norm": 0.6869592578044069, "acc_norm_stderr": 0.03166529474407705, "mc1": 0.4418604651162791, "mc1_stderr": 0.017384767478986214, "mc2": 0.6310264033909807, "mc2_stderr": 0.01502146266727205},
    "harness|arc:challenge|25": {"acc": 0.6749146757679181, "acc_stderr": 0.01368814730972912, "acc_norm": 0.7030716723549488, "acc_norm_stderr": 0.013352025976725227},
    "harness|hellaswag|10": {"acc": 0.6818362875921131, "acc_stderr": 0.004648115322328777, "acc_norm": 0.873132842063334, "acc_norm_stderr": 0.0033214390244115494},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7828947368421053, "acc_stderr": 0.03355045304882923, "acc_norm": 0.7828947368421053, "acc_norm_stderr": 0.03355045304882923},
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446},
    "harness|hendrycksTest-clinical_knowledge|5": {"acc": 0.7283018867924528, "acc_stderr": 0.027377706624670713, "acc_norm": 0.7283018867924528, "acc_norm_stderr": 0.027377706624670713},
    "harness|hendrycksTest-college_biology|5": {"acc": 0.8194444444444444, "acc_stderr": 0.032166008088022675, "acc_norm": 0.8194444444444444, "acc_norm_stderr": 0.032166008088022675},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605},
    "harness|hendrycksTest-college_computer_science|5": {"acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196},
    "harness|hendrycksTest-college_medicine|5": {"acc": 0.630057803468208, "acc_stderr": 0.0368122963339432, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.0368122963339432},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.3627450980392157, "acc_stderr": 0.04784060704105653, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105653},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816},
    "harness|hendrycksTest-conceptual_physics|5": {"acc": 0.6595744680851063, "acc_stderr": 0.030976692998534422, "acc_norm": 0.6595744680851063, "acc_norm_stderr": 0.030976692998534422},
    "harness|hendrycksTest-econometrics|5": {"acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455},
    "harness|hendrycksTest-electrical_engineering|5": {"acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232},
    "harness|hendrycksTest-elementary_mathematics|5": {"acc": 0.48148148148148145, "acc_stderr": 0.025733641991838987, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.025733641991838987},
    "harness|hendrycksTest-formal_logic|5": {"acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172},
    "harness|hendrycksTest-global_facts|5": {"acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919},
    "harness|hendrycksTest-high_school_biology|5": {"acc": 0.8032258064516129, "acc_stderr": 0.022616409420742025, "acc_norm": 0.8032258064516129, "acc_norm_stderr": 0.022616409420742025},
    "harness|hendrycksTest-high_school_chemistry|5": {"acc": 0.5270935960591133, "acc_stderr": 0.03512819077876106, "acc_norm": 0.5270935960591133, "acc_norm_stderr": 0.03512819077876106},
    "harness|hendrycksTest-high_school_computer_science|5": {"acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446},
    "harness|hendrycksTest-high_school_european_history|5": {"acc": 0.8545454545454545, "acc_stderr": 0.027530196355066573, "acc_norm": 0.8545454545454545, "acc_norm_stderr": 0.027530196355066573},
    "harness|hendrycksTest-high_school_geography|5": {"acc": 0.8939393939393939, "acc_stderr": 0.021938047738853113, "acc_norm": 0.8939393939393939, "acc_norm_stderr": 0.021938047738853113},
    "harness|hendrycksTest-high_school_government_and_politics|5": {"acc": 0.927461139896373, "acc_stderr": 0.018718998520678178, "acc_norm": 0.927461139896373, "acc_norm_stderr": 0.018718998520678178},
    "harness|hendrycksTest-high_school_macroeconomics|5": {"acc": 0.6948717948717948, "acc_stderr": 0.023346335293325887, "acc_norm": 0.6948717948717948, "acc_norm_stderr": 0.023346335293325887},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc": 0.3333333333333333, "acc_stderr": 0.028742040903948492, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948492},
    "harness|hendrycksTest-high_school_microeconomics|5": {"acc": 0.7478991596638656, "acc_stderr": 0.028205545033277726, "acc_norm": 0.7478991596638656, "acc_norm_stderr": 0.028205545033277726},
    "harness|hendrycksTest-high_school_physics|5": {"acc": 0.4503311258278146, "acc_stderr": 0.04062290018683775, "acc_norm": 0.4503311258278146, "acc_norm_stderr": 0.04062290018683775},
    "harness|hendrycksTest-high_school_psychology|5": {"acc": 0.8880733944954129, "acc_stderr": 0.013517352714958792, "acc_norm": 0.8880733944954129, "acc_norm_stderr": 0.013517352714958792},
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.03398110890294636},
    "harness|hendrycksTest-high_school_us_history|5": {"acc": 0.8970588235294118, "acc_stderr": 0.02132833757080438, "acc_norm": 0.8970588235294118, "acc_norm_stderr": 0.02132833757080438},
    "harness|hendrycksTest-high_school_world_history|5": {"acc": 0.8818565400843882, "acc_stderr": 0.021011052659878456, "acc_norm": 0.8818565400843882, "acc_norm_stderr": 0.021011052659878456},
    "harness|hendrycksTest-human_aging|5": {"acc": 0.7623318385650224, "acc_stderr": 0.02856807946471428, "acc_norm": 0.7623318385650224, "acc_norm_stderr": 0.02856807946471428},
    "harness|hendrycksTest-human_sexuality|5": {"acc": 0.8015267175572519, "acc_stderr": 0.0349814938546247, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.0349814938546247},
    "harness|hendrycksTest-international_law|5": {"acc": 0.8264462809917356, "acc_stderr": 0.03457272836917671, "acc_norm": 0.8264462809917356, "acc_norm_stderr": 0.03457272836917671},
    "harness|hendrycksTest-jurisprudence|5": {"acc": 0.8240740740740741, "acc_stderr": 0.036809181416738807, "acc_norm": 0.8240740740740741, "acc_norm_stderr": 0.036809181416738807},
    "harness|hendrycksTest-logical_fallacies|5": {"acc": 0.8098159509202454, "acc_stderr": 0.03083349114628123, "acc_norm": 0.8098159509202454, "acc_norm_stderr": 0.03083349114628123},
    "harness|hendrycksTest-machine_learning|5": {"acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116},
    "harness|hendrycksTest-management|5": {"acc": 0.7961165048543689, "acc_stderr": 0.0398913985953177, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.0398913985953177},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8846153846153846, "acc_stderr": 0.020930193185179333, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179333},
    "harness|hendrycksTest-medical_genetics|5": {"acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237},
    "harness|hendrycksTest-miscellaneous|5": {"acc": 0.8646232439335888, "acc_stderr": 0.012234384586856491, "acc_norm": 0.8646232439335888, "acc_norm_stderr": 0.012234384586856491},
    "harness|hendrycksTest-moral_disputes|5": {"acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.5910614525139665, "acc_stderr": 0.016442830654715548, "acc_norm": 0.5910614525139665, "acc_norm_stderr": 0.016442830654715548},
    "harness|hendrycksTest-nutrition|5": {"acc": 0.7091503267973857, "acc_stderr": 0.02600480036395213, "acc_norm": 0.7091503267973857, "acc_norm_stderr": 0.02600480036395213},
    "harness|hendrycksTest-philosophy|5": {"acc": 0.7717041800643086, "acc_stderr": 0.023839303311398195, "acc_norm": 0.7717041800643086, "acc_norm_stderr": 0.023839303311398195},
    "harness|hendrycksTest-prehistory|5": {"acc": 0.8024691358024691, "acc_stderr": 0.022152889927898968, "acc_norm": 0.8024691358024691, "acc_norm_stderr": 0.022152889927898968},
    "harness|hendrycksTest-professional_accounting|5": {"acc": 0.5460992907801419, "acc_stderr": 0.029700453247291474, "acc_norm": 0.5460992907801419, "acc_norm_stderr": 0.029700453247291474},
    "harness|hendrycksTest-professional_law|5": {"acc": 0.5430247718383312, "acc_stderr": 0.012722869501611419, "acc_norm": 0.5430247718383312, "acc_norm_stderr": 0.012722869501611419},
    "harness|hendrycksTest-professional_medicine|5": {"acc": 0.6838235294117647, "acc_stderr": 0.028245687391462927, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462927},
    "harness|hendrycksTest-professional_psychology|5": {"acc": 0.7238562091503268, "acc_stderr": 0.018087276935663137, "acc_norm": 0.7238562091503268, "acc_norm_stderr": 0.018087276935663137},
    "harness|hendrycksTest-public_relations|5": {"acc": 0.7090909090909091, "acc_stderr": 0.043502714429232425, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.043502714429232425},
    "harness|hendrycksTest-security_studies|5": {"acc": 0.7877551020408163, "acc_stderr": 0.026176967197866764, "acc_norm": 0.7877551020408163, "acc_norm_stderr": 0.026176967197866764},
    "harness|hendrycksTest-sociology|5": {"acc": 0.8706467661691543, "acc_stderr": 0.023729830881018526, "acc_norm": 0.8706467661691543, "acc_norm_stderr": 0.023729830881018526},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769},
    "harness|hendrycksTest-virology|5": {"acc": 0.536144578313253, "acc_stderr": 0.03882310850890594, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.03882310850890594},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866},
    "harness|truthfulqa:mc|0": {"mc1": 0.4418604651162791, "mc1_stderr": 0.017384767478986214, "mc2": 0.6310264033909807, "mc2_stderr": 0.01502146266727205}
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7
[ "region:us" ]
2023-09-04T01:38:25+00:00
{"pretty_name": "Evaluation run of yeontaek/llama-2-70B-ensemble-v7", "dataset_summary": "Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v7](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-04T02:38:01.038212](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7/blob/main/results_2023-09-04T02%3A38%3A01.038212.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6832397060915553,\n \"acc_stderr\": 0.031693477754770626,\n \"acc_norm\": 0.6869592578044069,\n \"acc_norm_stderr\": 0.03166529474407705,\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.017384767478986214,\n \"mc2\": 0.6310264033909807,\n \"mc2_stderr\": 0.01502146266727205\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.01368814730972912,\n \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6818362875921131,\n \"acc_stderr\": 0.004648115322328777,\n \"acc_norm\": 0.873132842063334,\n \"acc_norm_stderr\": 0.0033214390244115494\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882923,\n \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882923\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534422,\n \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534422\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277726,\n \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277726\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683775,\n \"acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080438,\n \"acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080438\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878456,\n \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878456\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n \"acc_stderr\": 0.02856807946471428,\n \"acc_norm\": 0.7623318385650224,\n \"acc_norm_stderr\": 0.02856807946471428\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 
0.8646232439335888,\n \"acc_norm_stderr\": 0.012234384586856491\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5910614525139665,\n \"acc_stderr\": 0.016442830654715548,\n \"acc_norm\": 0.5910614525139665,\n \"acc_norm_stderr\": 0.016442830654715548\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n \"acc_stderr\": 0.023839303311398195,\n \"acc_norm\": 0.7717041800643086,\n \"acc_norm_stderr\": 0.023839303311398195\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8024691358024691,\n \"acc_stderr\": 0.022152889927898968,\n \"acc_norm\": 0.8024691358024691,\n \"acc_norm_stderr\": 0.022152889927898968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291474,\n \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291474\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5430247718383312,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.5430247718383312,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7238562091503268,\n \"acc_stderr\": 0.018087276935663137,\n \"acc_norm\": 0.7238562091503268,\n \"acc_norm_stderr\": 0.018087276935663137\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.017384767478986214,\n \"mc2\": 0.6310264033909807,\n \"mc2_stderr\": 0.01502146266727205\n }\n}\n```", "repo_url": "https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", 
"data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|arc:challenge|25_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hellaswag|10_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T02:38:01.038212.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_04T02_38_01.038212", "path": ["results_2023-09-04T02:38:01.038212.parquet"]}, {"split": "latest", "path": ["results_2023-09-04T02:38:01.038212.parquet"]}]}]}
2023-09-04T01:39:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v7 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model yeontaek/llama-2-70B-ensemble-v7 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-04T02:38:01.038212 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
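The flattened card text above introduces a load call but the fenced code block was dropped in flattening. A minimal sketch of what it refers to, assuming the details repo follows the leaderboard's `open-llm-leaderboard/details_<org>__<model>` naming convention (the repo id below is inferred, not stated in this record) and using the `harness_truthfulqa_mc_0` config listed in the metadata above:

```python
from datasets import load_dataset

# Sketch only: the repo id is inferred from the leaderboard naming
# convention and does not appear verbatim in this record.
data = load_dataset(
    "open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7",  # assumed repo id
    "harness_truthfulqa_mc_0",  # one of the configs listed in the metadata above
    split="latest",             # or the timestamped split "2023_09_04T02_38_01.038212"
)
```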
[ "# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v7", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-70B-ensemble-v7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-04T02:38:01.038212(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v7", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-70B-ensemble-v7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-04T02:38:01.038212(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v7## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-70B-ensemble-v7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-04T02:38:01.038212(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
135f2602523906b3595f13e28a3fc589dd83dc65
# Dataset Card for "babylm-100M" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
deven367/babylm-100M
[ "region:us" ]
2023-09-04T01:41:10+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "valid", "path": "data/valid-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 567957485, "num_examples": 10176300}, {"name": "valid", "num_bytes": 54930583, "num_examples": 986022}, {"name": "test", "num_bytes": 59992087, "num_examples": 1008854}], "download_size": 429914407, "dataset_size": 682880155}}
2023-09-06T03:28:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for "babylm-100M" More Information needed
[ "# Dataset Card for \"babylm-100M\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"babylm-100M\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"babylm-100M\"\n\nMore Information needed" ]
b2ef03798b09ff40c0a099b4d2e86392d7a62390
# Dataset Card for Evaluation run of porkorbeef/Llama-2-13b-public ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/porkorbeef/Llama-2-13b-public - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [porkorbeef/Llama-2-13b-public](https://huggingface.co/porkorbeef/Llama-2-13b-public) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-17T07:54:08.425167](https://huggingface.co/datasets/open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public/blob/main/results_2023-10-17T07-54-08.425167.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0, "em_stderr": 0.0, "f1": 5.76761744966443e-05, "f1_stderr": 2.9264210748527048e-05, "acc": 0.24191002367797948, "acc_stderr": 0.007022563065489301 }, "harness|drop|3": { "em": 0.0, "em_stderr": 0.0, "f1": 5.76761744966443e-05, "f1_stderr": 2.9264210748527048e-05 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.48382004735595896, "acc_stderr": 0.014045126130978603 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public
[ "region:us" ]
2023-09-04T01:46:12+00:00
{"pretty_name": "Evaluation run of porkorbeef/Llama-2-13b-public", "dataset_summary": "Dataset automatically created during the evaluation run of model [porkorbeef/Llama-2-13b-public](https://huggingface.co/porkorbeef/Llama-2-13b-public) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T07:54:08.425167](https://huggingface.co/datasets/open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public/blob/main/results_2023-10-17T07-54-08.425167.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 5.76761744966443e-05,\n \"f1_stderr\": 2.9264210748527048e-05,\n \"acc\": 0.24191002367797948,\n \"acc_stderr\": 0.007022563065489301\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 5.76761744966443e-05,\n \"f1_stderr\": 2.9264210748527048e-05\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.48382004735595896,\n \"acc_stderr\": 0.014045126130978603\n }\n}\n```", "repo_url": "https://huggingface.co/porkorbeef/Llama-2-13b-public", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|arc:challenge|25_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T07_54_08.425167", "path": ["**/details_harness|drop|3_2023-10-17T07-54-08.425167.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T07-54-08.425167.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T07_54_08.425167", "path": ["**/details_harness|gsm8k|5_2023-10-17T07-54-08.425167.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T07-54-08.425167.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hellaswag|10_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:45:47.354690.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:45:47.354690.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T02:45:47.354690.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:45:47.354690.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T02:45:47.354690.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T02:45:47.354690.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T07_54_08.425167", "path": ["**/details_harness|winogrande|5_2023-10-17T07-54-08.425167.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T07-54-08.425167.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_04T02_45_47.354690", "path": ["results_2023-09-04T02:45:47.354690.parquet"]}, {"split": "2023_10_17T07_54_08.425167", "path": ["results_2023-10-17T07-54-08.425167.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T07-54-08.425167.parquet"]}]}]}
2023-10-17T06:54:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of porkorbeef/Llama-2-13b-public ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model porkorbeef/Llama-2-13b-public on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T07:54:08.425167 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
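The flattened card text above ends its load instructions without the code block; the full card and serialized metadata earlier in this record preserve it, so the call can be restored as-is:

```python
from datasets import load_dataset

# Load call reproduced from the full card earlier in this record.
data = load_dataset(
    "open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public",
    "harness_winogrande_5",
    split="train",
)
```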
[ "# Dataset Card for Evaluation run of porkorbeef/Llama-2-13b-public", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model porkorbeef/Llama-2-13b-public on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T07:54:08.425167(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of porkorbeef/Llama-2-13b-public", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model porkorbeef/Llama-2-13b-public on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T07:54:08.425167(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of porkorbeef/Llama-2-13b-public## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model porkorbeef/Llama-2-13b-public on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T07:54:08.425167(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
4712250c270bddbc95f7b0959243e62429f55f94
# Dataset Card for "the-stack-moonscript-clean" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jlkj/the-stack-moonscript-clean
[ "region:us" ]
2023-09-04T01:56:53+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "hexsha", "dtype": "string"}, {"name": "size", "dtype": "int64"}, {"name": "content", "dtype": "string"}, {"name": "avg_line_length", "dtype": "float64"}, {"name": "max_line_length", "dtype": "int64"}, {"name": "alphanum_fraction", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 20211229.498295177, "num_examples": 5520}, {"name": "test", "num_bytes": 1124066.5681117065, "num_examples": 307}, {"name": "valid", "num_bytes": 1124066.5681117065, "num_examples": 307}], "download_size": 9160074, "dataset_size": 22459362.63451859}}
2023-09-04T12:20:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for "the-stack-moonscript-clean" More Information needed
[ "# Dataset Card for \"the-stack-moonscript-clean\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"the-stack-moonscript-clean\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"the-stack-moonscript-clean\"\n\nMore Information needed" ]
e2efe1395ab20b735720472ea8d14547b0e16e1e
# Dataset Card for "bus_few35_front" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
FanChen0116/bus_few35_front
[ "region:us" ]
2023-09-04T02:12:22+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "O", "1": "I-from_location", "2": "B-from_location", "3": "B-leaving_date", "4": "I-leaving_date", "5": "I-to_location", "6": "B-to_location"}}}}, {"name": "request_slot", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 6172, "num_examples": 35}, {"name": "validation", "num_bytes": 6900, "num_examples": 35}, {"name": "test", "num_bytes": 6900, "num_examples": 35}], "download_size": 14148, "dataset_size": 19972}}
2023-09-07T02:56:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for "bus_few35_front" More Information needed
[ "# Dataset Card for \"bus_few35_front\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"bus_few35_front\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"bus_few35_front\"\n\nMore Information needed" ]
22717a775055d26a34d0286bc7ba7730e4374289
# Dataset Card for "bus_few35_front_empty" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
FanChen0116/bus_few35_front_empty
[ "region:us" ]
2023-09-04T02:15:13+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "O", "1": "I-from_location", "2": "B-from_location", "3": "B-leaving_date", "4": "I-leaving_date", "5": "I-to_location", "6": "B-to_location"}}}}, {"name": "request_slot", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 5491, "num_examples": 35}, {"name": "validation", "num_bytes": 6128, "num_examples": 35}, {"name": "test", "num_bytes": 6900, "num_examples": 35}], "download_size": 0, "dataset_size": 18519}}
2023-09-07T03:41:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "bus_few35_front_empty" More Information needed
[ "# Dataset Card for \"bus_few35_front_empty\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"bus_few35_front_empty\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"bus_few35_front_empty\"\n\nMore Information needed" ]
e5567b990d88eb3e6adf8ed90603c9fea5017c1f
# Dataset Card for "guanaco-llama2-1k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Dalamar96/guanaco-llama2-1k
[ "region:us" ]
2023-09-04T02:24:57+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1654448, "num_examples": 1000}], "download_size": 0, "dataset_size": 1654448}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-05T02:57:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for "guanaco-llama2-1k" More Information needed
[ "# Dataset Card for \"guanaco-llama2-1k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"guanaco-llama2-1k\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"guanaco-llama2-1k\"\n\nMore Information needed" ]
7cee9ce78617d5eb6c3d149651e42abf6fa49276
# Dataset Card for Dataset Name ## Dataset Description - **Homepage:** - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
cireix/mvml
[ "region:us" ]
2023-09-04T02:25:57+00:00
{}
2023-09-04T02:30:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name ## Dataset Description - Homepage: - Repository: - Paper: - Leaderboard: - Point of Contact: ### Dataset Summary This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Dataset Name", "## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name", "## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 8, 24, 32, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Dataset Name## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
32ddd0e39275ea44ec117e831a1626f3c22b5e38
# Dataset Card for "instructpix2pix-demo" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jijay/instructpix2pix-demo
[ "region:us" ]
2023-09-04T02:37:22+00:00
{"dataset_info": {"features": [{"name": "input_image", "dtype": "image"}, {"name": "edit_prompt", "dtype": "string"}, {"name": "edited_image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 4850304.0, "num_examples": 4}], "download_size": 0, "dataset_size": 4850304.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-05T07:03:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "instructpix2pix-demo" More Information needed
[ "# Dataset Card for \"instructpix2pix-demo\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"instructpix2pix-demo\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"instructpix2pix-demo\"\n\nMore Information needed" ]
a4c174151dd5b19c6389b5da80590517bd5e01ea
# Dataset Card for "instructpix2pix-demov2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jijay/instructpix2pix-demov2
[ "region:us" ]
2023-09-04T02:45:50+00:00
{"dataset_info": {"features": [{"name": "input_image", "dtype": "image"}, {"name": "edit_prompt", "dtype": "string"}, {"name": "edited_image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 4872706.0, "num_examples": 4}], "download_size": 4873592, "dataset_size": 4872706.0}}
2023-09-04T06:37:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for "instructpix2pix-demov2" More Information needed
[ "# Dataset Card for \"instructpix2pix-demov2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"instructpix2pix-demov2\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"instructpix2pix-demov2\"\n\nMore Information needed" ]
7fbfaa4f0210ba870adabee8139c5e8962aa42cc
# Dataset of Yarizui Sen

This is the dataset of Yarizui Sen, containing 198 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 198 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 420 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 198 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 198 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 198 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 198 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 198 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 420 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 420 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 420 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
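The download links in the table above point at files stored inside this repository, so a specific variant can also be fetched programmatically. A minimal sketch with the `huggingface_hub` client is shown below; the repo id and filename come from this card, everything else is an assumption about how you want to use it.

```python
from huggingface_hub import hf_hub_download

# Fetch one packaged variant of the dataset, e.g. the 512x704 aligned crops;
# the file lands in the local Hugging Face cache and its path is returned.
path = hf_hub_download(
    repo_id="CyberHarem/yarizui_sen_bento",
    filename="dataset-512x704.zip",
    repo_type="dataset",
)
print(path)
```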
CyberHarem/yarizui_sen_bento
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-04T02:54:20+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:47+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Yarizui Sen ====================== This is the dataset of Yarizui Sen, containing 198 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
143d5e7e2880dcf3df2df6570984143c2e9a9b45
This repository hosts the images and pretrained models used in the ICCV oral paper 'High-Quality Entity Segmentation'. The official link is https://github.com/adobe-research/EntitySeg-Dataset. The code link is https://github.com/qqlu/Entity/tree/main/Entityv2. Note that we do not own the copyright of the images. It is solely your responsibility to check the original licenses of the images before using them. Any use of the images is at your own discretion and risk.
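Since the viewer is disabled and the files are hosted directly in this dataset repository, a minimal sketch of fetching them with the `huggingface_hub` client is shown below; the repo id comes from this card, and downloading the full snapshot (rather than individual files) is an assumption about what you need.

```python
from huggingface_hub import snapshot_download

# Download the whole dataset repository (images and pretrained models)
# into the local cache; note that this may be large.
local_dir = snapshot_download(
    repo_id="qqlu1992/Adobe_EntitySeg",
    repo_type="dataset",
)
print(local_dir)
```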
qqlu1992/Adobe_EntitySeg
[ "region:us" ]
2023-09-04T02:58:26+00:00
{"viewer": false}
2023-09-07T00:03:14+00:00
[]
[]
TAGS #region-us
This repository hosts the images and pretrained models used in the ICCV oral paper 'High-Quality Entity Segmentation'. The official link is URL. The code link is URL. Note that we do not own the copyright of the images. It is solely your responsibility to check the original licenses of the images before using them. Any use of the images is at your own discretion and risk.
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
4f25da8879e2beb3abaf2e5a14d49f626ddf3fe4
# Dataset of Shaga Ayame

This is the dataset of Shaga Ayame, containing 171 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 171 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 374 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 171 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 171 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 171 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 171 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 171 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 374 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 374 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 374 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/shaga_ayame_bento
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-04T03:10:10+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:49+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Shaga Ayame ====================== This is the dataset of Shaga Ayame, containing 171 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
26e90cfa5409bbf5d3f77b33a00adb8e1443b6c7
# OneOS Dataset

The OneOS dataset is a collection of text data for the [OneOS project](https://github.com/wasertech/OneOS). It consists of a large number of text samples that can be used for training and evaluating natural language processing models.

## Dataset Details

- Number of Samples: 13,068
- License: CC0*
- Language: English, French

\* Only unlicensed sentences generated manually fall under Creative Commons Zero (CC0). Sentences already licensed under different terms, such as [nl2bash](https://github.com/TellinaTool/nl2bash) or [samantha-data](https://huggingface.co/datasets/ehartford/samantha-data), remain subject to their respective licenses. The same applies to sentences produced using language models operating under special licenses, like LLaMA or the GPT series.

## Dataset Format

Coming soon.
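Until the format section is written, a minimal loading sketch is shown below; it assumes only what the repo metadata already states (a single `text` column in one `train` split).

```python
from datasets import load_dataset

# Load the OneOS corpus; per the repo metadata it exposes a single
# "text" column in a "train" split.
ds = load_dataset("wasertech/OneOS", split="train")
print(ds)                    # features and row count
print(ds[0]["text"][:200])   # peek at the first sample
```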
wasertech/OneOS
[ "size_categories:10K<n<100K", "language:en", "language:fr", "license:cc0-1.0", "code", "bash", "python", "Web Search", "Wikipedia", "NLU", "region:us" ]
2023-09-04T03:11:48+00:00
{"language": ["en", "fr"], "license": "cc0-1.0", "size_categories": ["10K<n<100K"], "pretty_name": "OneOS", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 34712224, "num_examples": 13640}], "download_size": 3372844, "dataset_size": 34712224}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["code", "bash", "python", "Web Search", "Wikipedia", "NLU"]}
2024-01-14T23:23:56+00:00
[]
[ "en", "fr" ]
TAGS #size_categories-10K<n<100K #language-English #language-French #license-cc0-1.0 #code #bash #python #Web Search #Wikipedia #NLU #region-us
# OneOS Dataset The OneOS dataset is a collection of text data for the OneOS project. It consists of a large number of text samples that can be used for training and evaluating natural language processing models. ## Dataset Details - Number of Samples: 13,068 - License: CC0* - Language: English, French \* Only unlicensed sentences generated manually fall under Creative Commons Zero (CC0). Sentences already licensed under different terms, such as nl2bash or samantha-data, remain subject to their respective licenses. The same applies to sentences produced using language models operating under special licenses, like LLaMA or the GPT series. ## Dataset Format Coming soon.
[ "# OneOS Dataset\n\nThe OneOS dataset is a collection of text data for the OneOS project. It consists of a large number of text samples that can be used for training and evaluating natural language processing models.", "## Dataset Details\n - Number of Samples: 13,068\n - License: CC0*\n - Language: English, French\n\n \\* Only unlicensed sentences generated manually fall under CreativeCommon-0. Sentences already licensed under different terms, such as nl2bash or samantha-data, remain subject to their respective licenses. The same applies to sentences produced using language models operating under special licenses, like LLama or the GPT series.", "## Dataset Format\n\nComming soon." ]
[ "TAGS\n#size_categories-10K<n<100K #language-English #language-French #license-cc0-1.0 #code #bash #python #Web Search #Wikipedia #NLU #region-us \n", "# OneOS Dataset\n\nThe OneOS dataset is a collection of text data for the OneOS project. It consists of a large number of text samples that can be used for training and evaluating natural language processing models.", "## Dataset Details\n - Number of Samples: 13,068\n - License: CC0*\n - Language: English, French\n\n \\* Only unlicensed sentences generated manually fall under CreativeCommon-0. Sentences already licensed under different terms, such as nl2bash or samantha-data, remain subject to their respective licenses. The same applies to sentences produced using language models operating under special licenses, like LLama or the GPT series.", "## Dataset Format\n\nComming soon." ]
[ 51, 48, 101, 8 ]
[ "passage: TAGS\n#size_categories-10K<n<100K #language-English #language-French #license-cc0-1.0 #code #bash #python #Web Search #Wikipedia #NLU #region-us \n# OneOS Dataset\n\nThe OneOS dataset is a collection of text data for the OneOS project. It consists of a large number of text samples that can be used for training and evaluating natural language processing models.## Dataset Details\n - Number of Samples: 13,068\n - License: CC0*\n - Language: English, French\n\n \\* Only unlicensed sentences generated manually fall under CreativeCommon-0. Sentences already licensed under different terms, such as nl2bash or samantha-data, remain subject to their respective licenses. The same applies to sentences produced using language models operating under special licenses, like LLama or the GPT series.## Dataset Format\n\nComming soon." ]
90aed5b1e2c4a2c602b2d5e6ed6c79b7615128e0
This is a fork of https://huggingface.co/datasets/globis-university/aozorabunko-clean, filtered to the rows where `row["meta"]["文字遣い種別"] == "新字新仮名"` (i.e., the script-type field marks the work as modern kanji with modern kana).
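A minimal sketch of reproducing this filter from the upstream dataset is shown below; the upstream repo id and the field names are taken from this card, while the split name and the exact loading call are assumptions.

```python
from datasets import load_dataset

# Load the upstream corpus and keep only works written in modern kanji
# with modern kana (新字新仮名), which is exactly the filter this fork applies.
upstream = load_dataset("globis-university/aozorabunko-clean", split="train")
filtered = upstream.filter(lambda row: row["meta"]["文字遣い種別"] == "新字新仮名")
print(len(upstream), "->", len(filtered))
```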
if001/aozorabunko-clean-sin
[ "task_categories:text-generation", "task_categories:text-classification", "size_categories:10K<n<100K", "language:ja", "license:cc-by-4.0", "region:us" ]
2023-09-04T03:22:55+00:00
{"language": ["ja"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "text-classification"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "footnote", "dtype": "string"}, {"name": "meta", "struct": [{"name": "\u4f5c\u54c1ID", "dtype": "string"}, {"name": "\u4f5c\u54c1\u540d", "dtype": "string"}, {"name": "\u4f5c\u54c1\u540d\u8aad\u307f", "dtype": "string"}, {"name": "\u30bd\u30fc\u30c8\u7528\u8aad\u307f", "dtype": "string"}, {"name": "\u526f\u984c", "dtype": "string"}, {"name": "\u526f\u984c\u8aad\u307f", "dtype": "string"}, {"name": "\u539f\u984c", "dtype": "string"}, {"name": "\u521d\u51fa", "dtype": "string"}, {"name": "\u5206\u985e\u756a\u53f7", "dtype": "string"}, {"name": "\u6587\u5b57\u9063\u3044\u7a2e\u5225", "dtype": "string"}, {"name": "\u4f5c\u54c1\u8457\u4f5c\u6a29\u30d5\u30e9\u30b0", "dtype": "string"}, {"name": "\u516c\u958b\u65e5", "dtype": "timestamp[s]"}, {"name": "\u6700\u7d42\u66f4\u65b0\u65e5", "dtype": "timestamp[s]"}, {"name": "\u56f3\u66f8\u30ab\u30fc\u30c9URL", "dtype": "string"}, {"name": "\u4eba\u7269ID", "dtype": "string"}, {"name": "\u59d3", "dtype": "string"}, {"name": "\u540d", "dtype": "string"}, {"name": "\u59d3\u8aad\u307f", "dtype": "string"}, {"name": "\u540d\u8aad\u307f", "dtype": "string"}, {"name": "\u59d3\u8aad\u307f\u30bd\u30fc\u30c8\u7528", "dtype": "string"}, {"name": "\u540d\u8aad\u307f\u30bd\u30fc\u30c8\u7528", "dtype": "string"}, {"name": "\u59d3\u30ed\u30fc\u30de\u5b57", "dtype": "string"}, {"name": "\u540d\u30ed\u30fc\u30de\u5b57", "dtype": "string"}, {"name": "\u5f79\u5272\u30d5\u30e9\u30b0", "dtype": "string"}, {"name": "\u751f\u5e74\u6708\u65e5", "dtype": "string"}, {"name": "\u6ca1\u5e74\u6708\u65e5", "dtype": "string"}, {"name": "\u4eba\u7269\u8457\u4f5c\u6a29\u30d5\u30e9\u30b0", "dtype": "string"}, {"name": "\u5e95\u672c\u540d1", "dtype": "string"}, {"name": "\u5e95\u672c\u51fa\u7248\u793e\u540d1", "dtype": "string"}, {"name": "\u5e95\u672c\u521d\u7248\u767a\u884c\u5e741", "dtype": "string"}, {"name": "\u5165\u529b\u306b\u4f7f\u7528\u3057\u305f\u72481", "dtype": "string"}, {"name": "\u6821\u6b63\u306b\u4f7f\u7528\u3057\u305f\u72481", "dtype": "string"}, {"name": "\u5e95\u672c\u306e\u89aa\u672c\u540d1", "dtype": "string"}, {"name": "\u5e95\u672c\u306e\u89aa\u672c\u51fa\u7248\u793e\u540d1", "dtype": "string"}, {"name": "\u5e95\u672c\u306e\u89aa\u672c\u521d\u7248\u767a\u884c\u5e741", "dtype": "string"}, {"name": "\u5e95\u672c\u540d2", "dtype": "string"}, {"name": "\u5e95\u672c\u51fa\u7248\u793e\u540d2", "dtype": "string"}, {"name": "\u5e95\u672c\u521d\u7248\u767a\u884c\u5e742", "dtype": "string"}, {"name": "\u5165\u529b\u306b\u4f7f\u7528\u3057\u305f\u72482", "dtype": "string"}, {"name": "\u6821\u6b63\u306b\u4f7f\u7528\u3057\u305f\u72482", "dtype": "string"}, {"name": "\u5e95\u672c\u306e\u89aa\u672c\u540d2", "dtype": "string"}, {"name": "\u5e95\u672c\u306e\u89aa\u672c\u51fa\u7248\u793e\u540d2", "dtype": "string"}, {"name": "\u5e95\u672c\u306e\u89aa\u672c\u521d\u7248\u767a\u884c\u5e742", "dtype": "string"}, {"name": "\u5165\u529b\u8005", "dtype": "string"}, {"name": "\u6821\u6b63\u8005", "dtype": "string"}, {"name": "\u30c6\u30ad\u30b9\u30c8\u30d5\u30a1\u30a4\u30ebURL", "dtype": "string"}, {"name": "\u30c6\u30ad\u30b9\u30c8\u30d5\u30a1\u30a4\u30eb\u6700\u7d42\u66f4\u65b0\u65e5", "dtype": "timestamp[s]"}, {"name": "\u30c6\u30ad\u30b9\u30c8\u30d5\u30a1\u30a4\u30eb\u7b26\u53f7\u5316\u65b9\u5f0f", "dtype": "string"}, {"name": 
"\u30c6\u30ad\u30b9\u30c8\u30d5\u30a1\u30a4\u30eb\u6587\u5b57\u96c6\u5408", "dtype": "string"}, {"name": "\u30c6\u30ad\u30b9\u30c8\u30d5\u30a1\u30a4\u30eb\u4fee\u6b63\u56de\u6570", "dtype": "string"}, {"name": "XHTML/HTML\u30d5\u30a1\u30a4\u30ebURL", "dtype": "string"}, {"name": "XHTML/HTML\u30d5\u30a1\u30a4\u30eb\u6700\u7d42\u66f4\u65b0\u65e5", "dtype": "timestamp[s]"}, {"name": "XHTML/HTML\u30d5\u30a1\u30a4\u30eb\u7b26\u53f7\u5316\u65b9\u5f0f", "dtype": "string"}, {"name": "XHTML/HTML\u30d5\u30a1\u30a4\u30eb\u6587\u5b57\u96c6\u5408", "dtype": "string"}, {"name": "XHTML/HTML\u30d5\u30a1\u30a4\u30eb\u4fee\u6b63\u56de\u6570", "dtype": "string"}]}]}}
2023-09-04T04:02:32+00:00
[]
[ "ja" ]
TAGS #task_categories-text-generation #task_categories-text-classification #size_categories-10K<n<100K #language-Japanese #license-cc-by-4.0 #region-us
This is a fork of URL, filtered to the rows where row["meta"]["文字遣い種別"] == "新字新仮名" (i.e., works typeset in modern kanji with modern kana).
[]
[ "TAGS\n#task_categories-text-generation #task_categories-text-classification #size_categories-10K<n<100K #language-Japanese #license-cc-by-4.0 #region-us \n" ]
[ 55 ]
[ "passage: TAGS\n#task_categories-text-generation #task_categories-text-classification #size_categories-10K<n<100K #language-Japanese #license-cc-by-4.0 #region-us \n" ]
9caee59e196de8358ace8e012a6d06794fabd8f2
# Dataset of エルちゃん

This is the dataset of エルちゃん, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 418 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 418 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 418 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 418 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/eruchiyan_hirogaruskyprecure
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-04T03:28:40+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:51+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of エルちゃん ================ This is the dataset of エルちゃん, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
3d238b4d1b7674d5af472d97c348b35b8f71db3f
# Dataset of Cure Sky

This is the dataset of Cure Sky, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 449 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 449 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 449 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 449 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/cure_sky_hirogaruskyprecure
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-04T03:47:25+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:53+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Cure Sky =================== This is the dataset of Cure Sky, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
3052e0ad5aa1f0095c90b72dbf9d1e007d209ef7
This dataset was created for instruction tuning purposes. It is based on the [News Summarization](https://huggingface.co/datasets/sustcsenlp/bn_news_summarization) dataset. The instructions are given in the `inputs` column and their completions/answers are provided in the `targets` column. The `template_id` column tracks each input_template-target_template pair. There are 15 template IDs (from 1 to 15). The IDs and their respective templates are given below. `no_template` indicates that no template was used and only the summary or direct answer was provided for that input.

| ID | inputs_template | targets_template |
| ----- | ----- | ----- |
| 1 | এই সংবাদের জন্য একটি সংবাদ শিরোনাম লেখ: | প্রদত্ত সংবাদের সংবাদ শিরোনাম হলো, |
| 2 | একটি বাক্যে লেখাটির মূল বক্তব্য তুলে ধর: | প্রদত্ত অনুচ্ছেদের সংক্ষিপ্ত মূলভাব হলো, |
| 3 | নিচের অনুচ্ছেদে কী বলা হয়েছে তা সংক্ষেপে বর্ণনা কর। | প্রদত্ত অনুচ্ছেদের সংক্ষিপ্ত মূলভাব হলো, |
| 4 | নিচের অনুচ্ছেদে কী বলা হয়েছে তা সংক্ষেপে বর্ণনা কর। | no_template |
| 5 | এক বাক্যে নিচের অনুচ্ছেদের সারাংশ লেখ। | প্রদত্ত অনুচ্ছেদের সংক্ষিপ্ত মূলভাব হলো, |
| 6 | সংক্ষেপে বাক্যটির মূলভাব তুলে ধরো: | no_template |
| 7 | সংবাদ শিরোনাম লিখুন: | সংবাদটির শিরোনাম হলো, |
| 8 | সংক্ষেপে বাক্যটির মূলভাব তুলে ধরো: | বাক্যটির সংক্ষিপ্ত মূলভাব হলো, |
| 9 | নিন্মলিখিত সংবাদের শিরোনাম কী হতে পারে? | প্রদত্ত সংবাদের সংবাদ শিরোনাম হলো, |
| 10 | এক বাক্যে নিচের অনুচ্ছেদের সারাংশ লেখ। | no_template |
| 11 | আরো কম শব্দে বাক্যটির মূলভাব বর্ণনা কর: | no_template |
| 12 | প্রদত্ত তথ্য ব্যবহার করে একটি সংবাদ শিরোনাম লিখুন: | সংবাদটির শিরোনাম হলো, |
| 13 | আরো কম শব্দে বাক্যটির মূলভাব বর্ণনা কর: | বাক্যটির সংক্ষিপ্ত মূলভাব হলো, |
| 14 | একটি বাক্যে লেখাটির মূল বক্তব্য তুলে ধর: | no_template |
| 15 | নিম্নলিখিত সংবাদের ভিত্তিতে একটি সংবাদ শিরোনাম লিখুন | প্রদত্ত সংবাদের সংবাদ শিরোনাম হলো, |
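A minimal sketch of slicing the data by `template_id` is shown below; the repo id and column names come from this card, while the split name, the direct `load_dataset` call, and the integer type of `template_id` are assumptions.

```python
from datasets import load_dataset

# Keep only the headline-generation examples built from template 1;
# template_id is assumed to be stored as an integer.
ds = load_dataset("TahmidH/annotated_news_summary", split="train")
headline_examples = ds.filter(lambda row: row["template_id"] == 1)
print(headline_examples[0]["inputs"])
print(headline_examples[0]["targets"])
```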
TahmidH/annotated_news_summary
[ "task_categories:summarization", "size_categories:10K<n<100K", "language:bn", "license:cc0-1.0", "region:us" ]
2023-09-04T04:06:15+00:00
{"language": ["bn"], "license": "cc0-1.0", "size_categories": ["10K<n<100K"], "task_categories": ["summarization"]}
2024-02-04T14:49:53+00:00
[]
[ "bn" ]
TAGS #task_categories-summarization #size_categories-10K<n<100K #language-Bengali #license-cc0-1.0 #region-us
This dataset was created for instruction tuning purposes. It is based on the News Summarization dataset. The instructions are given in the 'inputs' column and their completions/answers are provided in the 'targets' column. The 'template\_id' column tracks each input\_template-target\_template pair. There are 15 template IDs (from 1 to 15). The IDs and their respective templates are given below. 'no\_template' indicates that no template was used and only the summary or direct answer was provided for that input. ID: 1, inputs\_template: এই সংবাদের জন্য একটি সংবাদ শিরোনাম লেখ:, targets\_template: প্রদত্ত সংবাদের সংবাদ শিরোনাম হলো, ID: 2, inputs\_template: একটি বাক্যে লেখাটির মূল বক্তব্য তুলে ধর:, targets\_template: প্রদত্ত অনুচ্ছেদের সংক্ষিপ্ত মূলভাব হলো, ID: 3, inputs\_template: নিচের অনুচ্ছেদে কী বলা হয়েছে তা সংক্ষেপে বর্ণনা কর।, targets\_template: প্রদত্ত অনুচ্ছেদের সংক্ষিপ্ত মূলভাব হলো, ID: 4, inputs\_template: নিচের অনুচ্ছেদে কী বলা হয়েছে তা সংক্ষেপে বর্ণনা কর।, targets\_template: no\_template ID: 5, inputs\_template: এক বাক্যে নিচের অনুচ্ছেদের সারাংশ লেখ।, targets\_template: প্রদত্ত অনুচ্ছেদের সংক্ষিপ্ত মূলভাব হলো, ID: 6, inputs\_template: সংক্ষেপে বাক্যটির মূলভাব তুলে ধরো:, targets\_template: no\_template ID: 7, inputs\_template: সংবাদ শিরোনাম লিখুন:, targets\_template: সংবাদটির শিরোনাম হলো, ID: 8, inputs\_template: সংক্ষেপে বাক্যটির মূলভাব তুলে ধরো:, targets\_template: বাক্যটির সংক্ষিপ্ত মূলভাব হলো, ID: 9, inputs\_template: নিন্মলিখিত সংবাদের শিরোনাম কী হতে পারে?, targets\_template: প্রদত্ত সংবাদের সংবাদ শিরোনাম হলো, ID: 10, inputs\_template: এক বাক্যে নিচের অনুচ্ছেদের সারাংশ লেখ।, targets\_template: no\_template ID: 11, inputs\_template: আরো কম শব্দে বাক্যটির মূলভাব বর্ণনা কর:, targets\_template: no\_template ID: 12, inputs\_template: প্রদত্ত তথ্য ব্যবহার করে একটি সংবাদ শিরোনাম লিখুন:, targets\_template: সংবাদটির শিরোনাম হলো, ID: 13, inputs\_template: আরো কম শব্দে বাক্যটির মূলভাব বর্ণনা কর:, targets\_template: বাক্যটির সংক্ষিপ্ত মূলভাব হলো, ID: 14, inputs\_template: একটি বাক্যে লেখাটির মূল বক্তব্য তুলে ধর:, targets\_template: no\_template ID: 15, inputs\_template: নিম্নলিখিত সংবাদের ভিত্তিতে একটি সংবাদ শিরোনাম লিখুন, targets\_template: প্রদত্ত সংবাদের সংবাদ শিরোনাম হলো,
[]
[ "TAGS\n#task_categories-summarization #size_categories-10K<n<100K #language-Bengali #license-cc0-1.0 #region-us \n" ]
[ 41 ]
[ "passage: TAGS\n#task_categories-summarization #size_categories-10K<n<100K #language-Bengali #license-cc0-1.0 #region-us \n" ]
e759fa0a0c43bbb0e56f21d0e4d3ce47d1455ac8
# Dataset of Cure Prism

This is the dataset of Cure Prism, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 456 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 456 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 456 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 456 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/cure_prism_hirogaruskyprecure
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-04T04:06:32+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:55+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Cure Prism ===================== This is the dataset of Cure Prism, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
298a857c3c559052358ddeb35e626410490ca318
# Dataset of Cure Butterfly

This is the dataset of Cure Butterfly, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 431 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 431 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 431 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 431 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
CyberHarem/cure_butterfly_hirogaruskyprecure
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-04T04:21:35+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2023-09-17T16:27:57+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Cure Butterfly ========================= This is the dataset of Cure Butterfly, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
[]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n" ]
945fc2c56b2709ad53259f85272ddf59f5b22eb6
# Dataset Card for Dataset Name ## Dataset Description - **Homepage:** - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
ronny1804/hhh
[ "region:us" ]
2023-09-04T04:22:50+00:00
{}
2023-09-04T04:34:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name ## Dataset Description - Homepage: - Repository: - Paper: - Leaderboard: - Point of Contact: ### Dataset Summary This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Dataset Name", "## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name", "## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 8, 24, 32, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Dataset Name## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1d77fdbb8a18808ca9596b047250d3baf4ecaf93
# Dataset Card for "eli5-subquestion-d2-paired" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
JihyukKim/eli5-subquestion-d2-paired
[ "region:us" ]
2023-09-04T04:25:30+00:00
{"dataset_info": {"features": [{"name": "qid", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "response_j", "dtype": "string"}, {"name": "response_k", "dtype": "string"}, {"name": "gold_claims", "sequence": "string"}, {"name": "response_j_claims", "sequence": "string"}, {"name": "response_k_claims", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 49866299, "num_examples": 40804}, {"name": "test", "num_bytes": 930052, "num_examples": 770}], "download_size": 8499365, "dataset_size": 50796351}}
2023-09-04T04:26:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for "eli5-subquestion-d2-paired" More Information needed
[ "# Dataset Card for \"eli5-subquestion-d2-paired\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"eli5-subquestion-d2-paired\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"eli5-subquestion-d2-paired\"\n\nMore Information needed" ]
a40d4358449f0f1eb3b0bcfabb7e01442ff40fd7
Guo Jing character data for ChatHaruhi2. It can be invoked as follows:

```python
from chatharuhi import ChatHaruhi

chatbot = ChatHaruhi(role_from_hf='hhhwmws/guojing',
                     llm='openai')
response = chatbot.chat(role='欧阳锋', text='是我!')
print(response)
```

Uploader: 米唯实

For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya).

You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting).

### Citation

Please cite the repo if you use the data or code in this repo.

```
@misc{li2023chatharuhi,
      title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
      author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
      year={2023},
      eprint={2308.09597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
hhhwmws/guojing
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:zh", "license:cc-by-4.0", "arxiv:2308.09597", "region:us" ]
2023-09-04T04:26:09+00:00
{"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]}
2023-09-04T04:27:46+00:00
[ "2308.09597" ]
[ "zh" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us
Guo Jing character data for ChatHaruhi2; it can be invoked as follows. Uploader: 米唯实 For more details, see ChatHaruhi. You are welcome to join our crowdsourced character creation project. Citation: Please cite the repo if you use the data or code in this repo.
[]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
42b37caf695b25bde4685b4344fd28c7d27e17ca
# Dataset Card for "eli5-subquestion-d2-paired-sft" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
JihyukKim/eli5-subquestion-d2-paired-sft
[ "region:us" ]
2023-09-04T04:26:48+00:00
{"dataset_info": {"features": [{"name": "qid", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "response_j", "dtype": "string"}, {"name": "response_k", "dtype": "string"}, {"name": "gold_claims", "sequence": "string"}, {"name": "response_j_claims", "sequence": "string"}, {"name": "response_k_claims", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 19891518, "num_examples": 16494}, {"name": "test", "num_bytes": 382602, "num_examples": 317}], "download_size": 6291749, "dataset_size": 20274120}}
2023-09-04T04:27:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for "eli5-subquestion-d2-paired-sft" More Information needed
[ "# Dataset Card for \"eli5-subquestion-d2-paired-sft\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"eli5-subquestion-d2-paired-sft\"\n\nMore Information needed" ]
[ 6, 23 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"eli5-subquestion-d2-paired-sft\"\n\nMore Information needed" ]
5c4b30d945e5dba2eda85b20a1b43160991a30ef
Huang Rong character data for ChatHaruhi2. It can be invoked as follows:

```python
from chatharuhi import ChatHaruhi

chatbot = ChatHaruhi(role_from_hf='hhhwmws/huangrong',
                     llm='openai')
response = chatbot.chat(role='郭靖', text='蓉儿!')
print(response)
```

Uploader: 米唯实

For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya).

You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting).

### Citation

Please cite the repo if you use the data or code in this repo.

```
@misc{li2023chatharuhi,
      title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
      author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
      year={2023},
      eprint={2308.09597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
hhhwmws/huangrong
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:zh", "license:cc-by-4.0", "arxiv:2308.09597", "region:us" ]
2023-09-04T04:28:37+00:00
{"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]}
2023-09-04T04:30:26+00:00
[ "2308.09597" ]
[ "zh" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us
Huang Rong character data for ChatHaruhi2; it can be invoked as follows. Uploader: 米唯实 For more details, see ChatHaruhi. You are welcome to join our crowdsourced character creation project. Citation: Please cite the repo if you use the data or code in this repo.
[]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
c3f26505a02c57cd59b04277c04b42f2db9a0dd8
Ouyang Feng character data for ChatHaruhi2. It can be invoked as follows:

```python
from chatharuhi import ChatHaruhi

chatbot = ChatHaruhi(role_from_hf='hhhwmws/ouyangfeng',
                     llm='openai')
response = chatbot.chat(role='郭靖', text='欧阳锋!是你!')
print(response)
```

Uploader: 米唯实

For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya).

You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting).

### Citation

Please cite the repo if you use the data or code in this repo.

```
@misc{li2023chatharuhi,
      title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
      author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
      year={2023},
      eprint={2308.09597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
hhhwmws/ouyangfeng
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:zh", "license:cc-by-4.0", "arxiv:2308.09597", "region:us" ]
2023-09-04T04:31:23+00:00
{"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]}
2023-09-04T04:47:43+00:00
[ "2308.09597" ]
[ "zh" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us
Ouyang Feng character data for ChatHaruhi2, which can be loaded as shown above. Uploader: 米唯实. For more details, see ChatHaruhi. You are welcome to join our crowdsourced character creation project. Citation: Please cite the repo if you use the data or code in this repo.
[]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
55ce6e6e56f8f9538b0fec410b172c67434f0e5a
Huang Yaoshi character data for ChatHaruhi2, which can be loaded as follows:

```python

from chatharuhi import ChatHaruhi

chatbot = ChatHaruhi(role_from_hf='hhhwmws/huangyaoshi',
                     llm='openai')
response = chatbot.chat(role='郭靖', text='黄药师!是你!')
print(response)

```

Uploader: 米唯实

For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya).

You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting).

### Citation

Please cite the repo if you use the data or code in this repo.

```
@misc{li2023chatharuhi,
      title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model}, 
      author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
      year={2023},
      eprint={2308.09597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
hhhwmws/huangyaoshi
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:zh", "license:cc-by-4.0", "arxiv:2308.09597", "region:us" ]
2023-09-04T04:48:58+00:00
{"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]}
2023-09-04T04:50:27+00:00
[ "2308.09597" ]
[ "zh" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us
Huang Yaoshi character data for ChatHaruhi2, which can be loaded as shown above. Uploader: 米唯实. For more details, see ChatHaruhi. You are welcome to join our crowdsourced character creation project. Citation: Please cite the repo if you use the data or code in this repo.
[]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n" ]
779cffbc504d76d71895ab944e9b8839e209bf8d
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T15:56:51.054424](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w/blob/main/results_2023-09-22T15-56-51.054424.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.13653523489932887,
        "em_stderr": 0.0035162871401896623,
        "f1": 0.18752202181207997,
        "f1_stderr": 0.0035554972989016802,
        "acc": 0.4267978240433516,
        "acc_stderr": 0.009809122705480169
    },
    "harness|drop|3": {
        "em": 0.13653523489932887,
        "em_stderr": 0.0035162871401896623,
        "f1": 0.18752202181207997,
        "f1_stderr": 0.0035554972989016802
    },
    "harness|gsm8k|5": {
        "acc": 0.08642911296436695,
        "acc_stderr": 0.007740044337103793
    },
    "harness|winogrande|5": {
        "acc": 0.7671665351223362,
        "acc_stderr": 0.011878201073856544
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
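As a complement to the per-task example above, the aggregated metrics quoted in "Latest results" can be pulled from the "results" configuration; a minimal sketch, with the config and split names taken from this card's own metadata:

```python
from datasets import load_dataset

# The "results" config aggregates every run; its "latest" split points at the
# newest one (here 2023-09-22T15:56:51.054424).
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the latest run
```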
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w
[ "region:us" ]
2023-09-04T05:14:40+00:00
{"pretty_name": "Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w", "dataset_summary": "Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T15:56:51.054424](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w/blob/main/results_2023-09-22T15-56-51.054424.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13653523489932887,\n \"em_stderr\": 0.0035162871401896623,\n \"f1\": 0.18752202181207997,\n \"f1_stderr\": 0.0035554972989016802,\n \"acc\": 0.4267978240433516,\n \"acc_stderr\": 0.009809122705480169\n },\n \"harness|drop|3\": {\n \"em\": 0.13653523489932887,\n \"em_stderr\": 0.0035162871401896623,\n \"f1\": 0.18752202181207997,\n \"f1_stderr\": 0.0035554972989016802\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \"acc_stderr\": 0.007740044337103793\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856544\n }\n}\n```", "repo_url": "https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|arc:challenge|25_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T15_56_51.054424", "path": ["**/details_harness|drop|3_2023-09-22T15-56-51.054424.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T15-56-51.054424.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T15_56_51.054424", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-56-51.054424.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-56-51.054424.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hellaswag|10_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:14:16.488025.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:14:16.488025.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:14:16.488025.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T06:14:16.488025.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T06:14:16.488025.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T06:14:16.488025.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T15_56_51.054424", "path": ["**/details_harness|winogrande|5_2023-09-22T15-56-51.054424.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T15-56-51.054424.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_04T06_14_16.488025", "path": ["results_2023-09-04T06:14:16.488025.parquet"]}, {"split": "2023_09_22T15_56_51.054424", "path": ["results_2023-09-22T15-56-51.054424.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T15-56-51.054424.parquet"]}]}]}
2023-09-22T14:57:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-09-22T15:56:51.054424 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
[ "# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T15:56:51.054424(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T15:56:51.054424(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 31, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T15:56:51.054424(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1aa63ed4ae6fbf1884b31e46d1d3130e81777467
# Dataset Card for TAIDE-14-tasks

### Dataset Summary
The "TAIDE-14-tasks" dataset, derived from the TAIDE project, encompasses 14 prevalent text generation tasks. This dataset features a collection of 140 prompts tailored for assessing Traditional Chinese Large Language Models (LLMs). GPT-4 meticulously crafted these prompts using the provided task, domain, and keywords from the instructions, with further validation by human experts. Each data entry not only contains the main content but also offers both positive and negative reference responses. These positive and negative reference responses are generated by GPT-4 and then manually proofread to ensure accuracy and relevance. For those keen on evaluating LLMs, we advocate for the G-Eval methodology.

Topics Covered (50):
```
{'人類學和社會學', '心理學和心理健康', '心靈和身心健康', '生物學和生物技術', '地理和地球科學', '老年人和長者議題',
 '汽車和交通', '宗教和信仰', '法律和法規', '社區和社會發展', '社會和文化議題', '社群媒體和網路文化', '青少年和成年人生活', '品牌和行銷', '建築和設計',
 '政治和國際關係', '科技和人工智慧', '科學和探索', '音樂和音樂創作', '飛行和航空業', '家庭和家居裝潢', '家庭和親密關係', '旅遊和探險', '時尚和美容',
 '時尚和配件', '神秘學和靈性', '財務和投資', '假日和節日', '動物和動物保護', '教育和學習', '飲食和烹飪', '傳媒和新聞業', '愛情和浪漫',
 '節能和可持續發展', '經濟和商業', '資料和統計學', '遊戲和遊戲設計', '電子商務和網路交易', '電影和電影產業', '慈善和志願服務', '演講和公共演講',
 '網路安全和隱私', '歷史和文化遺產', '環境和氣候變化', '職業和就業市場', '職業道德和商業道德', '醫學和健康', '寵物和動物福利', '藝術和文學', '體育和健身'}
```

Tasks Included (14):
```
{'開放式生成', '分類', '問答', '摘要任務', '寫作', '翻譯', '文本分析', '常識推理', '寫信', '抽取', '推薦', '情緒分析', '提供建議', '對話生成'}
```

### Languages

* Predominantly in Traditional Chinese.
* Some portions in English.

## Dataset Structure

### Data Columns

* Topic
* Task
* Keywords
* Prompt
* Input
* Positive Response
* Negative Response

### Data Instances
```
{
 'Topic': '科技和人工智慧',
 'Task': '寫信',
 'Keywords': '如何?',
 'Prompt': '假設你是一名人工智慧專家,一家公司希望了解如何將AI技術引入他們的產品。請寫一封回信,解釋如何將AI技術整合到他們的業務中。',
 'Input': '<noinput>',
 'Positive Response': '親愛的[公司名稱]: 非常感謝您的來信,我很高興能為您提供有關將AI技術整合到您的產品中的建議。在當今的科技世界中,人工智慧(AI)已經成為許多成功創新產品的核心驅動力之一。 首先,我建議您分析哪些部分的業務流程可以透過使用AI技術得到改進。這可以包括自動化的客戶服務,資料分析和預測,產品推薦等。確定這些範疇後,可以開始研究適用於每個特定問題的AI解決方案。 接下來,您需要建立一個專業的AI團隊,包括資料科學家、機器學習工程師、軟體開發人員等。此外,您可能需要提供相應的培訓,以確保團隊成員具備實現AI整合的技能。 在技術層面上,您可以選擇開源的機器學習框架,如TensorFlow或PyTorch,以編寫和設計自己的機器學習模型。此外,您還可以使用現有的API和AI平台,例如Google Cloud AI或Microsoft Azure AI,這些平台提供了大量預先設計的AI解決方案。 最後,將AI技術整合到產品中的過程應該是可持續且不斷優化的。確保持續收集資料,以便為機器學習模型提供更多資料,並不斷改進其性能。 希望我的建議對您有所幫助。如果您有任何進一步的疑問或需要專業建議,請隨時與我聯繫。 祝商祺! [您的名字]',
 'Negative Response': '親愛的[公司名稱]: 嗨,謝謝你的信。把AI加到你的業務裡就是找個電腦高手,讓他們給你做個程式,然後看看有沒有什麼好玩的功能可以加進去,就這麼簡單。 祝好運! [您的名字]',
}
```

## Licensing Information

The dataset is available under the [Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode) license.
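To iterate over the columns listed above programmatically, the dataset can be loaded from the Hub. A minimal sketch; the split name `train` is an assumption, since the card does not document its splits:

```python
from datasets import load_dataset

# Split name "train" is an assumption -- the card does not name its splits.
ds = load_dataset("taide/TAIDE-14-tasks", split="train")

for row in ds.select(range(3)):
    # Each entry carries a prompt plus GPT-4-written positive and negative
    # reference responses, suitable for G-Eval-style comparison.
    print(row["Topic"], "|", row["Task"])
    print(row["Prompt"])
    print(row["Positive Response"][:60], "...")
```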
taide/TAIDE-14-tasks
[ "task_categories:text-generation", "task_categories:question-answering", "task_categories:conversational", "size_categories:n<1K", "language:zh", "language:en", "license:cc-by-nc-4.0", "gpt4", "region:us" ]
2023-09-04T05:21:18+00:00
{"language": ["zh", "en"], "license": "cc-by-nc-4.0", "size_categories": ["n<1K"], "task_categories": ["text-generation", "question-answering", "conversational"], "tags": ["gpt4"]}
2023-10-26T08:14:32+00:00
[]
[ "zh", "en" ]
TAGS #task_categories-text-generation #task_categories-question-answering #task_categories-conversational #size_categories-n<1K #language-Chinese #language-English #license-cc-by-nc-4.0 #gpt4 #region-us
# Dataset Card for TAIDE-14-tasks ### Dataset Summary The "TAIDE-14-tasks" dataset, derived from the TAIDE project, encompasses 14 prevalent text generation tasks. This dataset features a collection of 140 prompts tailored for assessing Traditional Chinese Large Language Models (LLM). GPT-4 meticulously crafted these prompts using the provided task, domain, and keywords from the instructions, with further validation by human experts. Each data entry not only contains the main content but also offers both positive and negative reference responses. These positive and negative reference responses are generated by GPT-4 and then manually proofread to ensure accuracy and relevance. For those keen on evaluating LLMs, we advocate for the G-Eval methodology. Topics Covered (50): Tasks Included (14): ### Language * Predominantly in Traditional Chinese. * Some portions in English. ## Dataset Structure ### Data Columns * Topic * Task * Keywords * Prompt * Input * Positive Response * Negative Response ### Data Instances ## Licensing Information The dataset is available under the Creative Commons NonCommercial (CC BY-NC 4.0).
[ "# Dataset Card for TAIDE-14-tasks", "### Dataset Summary\n\nThe \"TAIDE-14-tasks\" dataset, derived from the TAIDE project, encompasses 14 prevalent text generation tasks. This dataset features a collection of 140 prompts tailored for assessing Traditional Chinese Large Language Models (LLM). GPT-4 meticulously crafted these prompts using the provided task, domain, and keywords from the instructions, with further validation by human experts. Each data entry not only contains the main content but also offers both positive and negative reference responses. These positive and negative reference responses are generated by GPT-4 and then manually proofread to ensure accuracy and relevance. For those keen on evaluating LLMs, we advocate for the G-Eval methodology.\nTopics Covered (50):\n\n\nTasks Included (14):", "### Language\n\n* Predominantly in Traditional Chinese.\n* Some portions in English.", "## Dataset Structure", "### Data Columns\n\n* Topic\n* Task\n* Keywords\n* Prompt\n* Input\n* Positive Response\n* Negative Response", "### Data Instances", "## Licensing Information\n\nThe dataset is available under the Creative Commons NonCommercial (CC BY-NC 4.0)." ]
[ "TAGS\n#task_categories-text-generation #task_categories-question-answering #task_categories-conversational #size_categories-n<1K #language-Chinese #language-English #license-cc-by-nc-4.0 #gpt4 #region-us \n", "# Dataset Card for TAIDE-14-tasks", "### Dataset Summary\n\nThe \"TAIDE-14-tasks\" dataset, derived from the TAIDE project, encompasses 14 prevalent text generation tasks. This dataset features a collection of 140 prompts tailored for assessing Traditional Chinese Large Language Models (LLM). GPT-4 meticulously crafted these prompts using the provided task, domain, and keywords from the instructions, with further validation by human experts. Each data entry not only contains the main content but also offers both positive and negative reference responses. These positive and negative reference responses are generated by GPT-4 and then manually proofread to ensure accuracy and relevance. For those keen on evaluating LLMs, we advocate for the G-Eval methodology.\nTopics Covered (50):\n\n\nTasks Included (14):", "### Language\n\n* Predominantly in Traditional Chinese.\n* Some portions in English.", "## Dataset Structure", "### Data Columns\n\n* Topic\n* Task\n* Keywords\n* Prompt\n* Input\n* Positive Response\n* Negative Response", "### Data Instances", "## Licensing Information\n\nThe dataset is available under the Creative Commons NonCommercial (CC BY-NC 4.0)." ]
[ 73, 11, 184, 20, 6, 28, 6, 25 ]
[ "passage: TAGS\n#task_categories-text-generation #task_categories-question-answering #task_categories-conversational #size_categories-n<1K #language-Chinese #language-English #license-cc-by-nc-4.0 #gpt4 #region-us \n# Dataset Card for TAIDE-14-tasks### Dataset Summary\n\nThe \"TAIDE-14-tasks\" dataset, derived from the TAIDE project, encompasses 14 prevalent text generation tasks. This dataset features a collection of 140 prompts tailored for assessing Traditional Chinese Large Language Models (LLM). GPT-4 meticulously crafted these prompts using the provided task, domain, and keywords from the instructions, with further validation by human experts. Each data entry not only contains the main content but also offers both positive and negative reference responses. These positive and negative reference responses are generated by GPT-4 and then manually proofread to ensure accuracy and relevance. For those keen on evaluating LLMs, we advocate for the G-Eval methodology.\nTopics Covered (50):\n\n\nTasks Included (14):### Language\n\n* Predominantly in Traditional Chinese.\n* Some portions in English.## Dataset Structure### Data Columns\n\n* Topic\n* Task\n* Keywords\n* Prompt\n* Input\n* Positive Response\n* Negative Response### Data Instances## Licensing Information\n\nThe dataset is available under the Creative Commons NonCommercial (CC BY-NC 4.0)." ]
945416fefd6ee84d79710fe870860045bb4e5336
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-orca-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-orca-13b](https://huggingface.co/uukuguy/speechless-codellama-orca-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T18:58:19.504304](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-13b/blob/main/results_2023-10-23T18-58-19.504304.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2686661073825503,
        "em_stderr": 0.004539457381903774,
        "f1": 0.3305505453020149,
        "f1_stderr": 0.00452265523617686,
        "acc": 0.3499942854509976,
        "acc_stderr": 0.010012878870777758
    },
    "harness|drop|3": {
        "em": 0.2686661073825503,
        "em_stderr": 0.004539457381903774,
        "f1": 0.3305505453020149,
        "f1_stderr": 0.00452265523617686
    },
    "harness|gsm8k|5": {
        "acc": 0.05989385898407885,
        "acc_stderr": 0.006536148151288716
    },
    "harness|winogrande|5": {
        "acc": 0.6400947119179163,
        "acc_stderr": 0.013489609590266799
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
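The card above shows how to load one per-task configuration; the aggregated scores live in the "results" configuration, whose "latest" split always resolves to the newest run. A minimal sketch of pulling those aggregates — the configuration and split names are stated in the card, but the column layout of the results parquet is not, so inspect the schema before relying on it:

```python
from datasets import load_dataset

# "results" and "latest" are named in the card; the row layout is not
# documented here, so print the schema before assuming column names.
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-13b",
    "results",
    split="latest",
)
print(results.column_names)
```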
open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-13b
[ "region:us" ]
2023-09-04T05:25:47+00:00
{"pretty_name": "Evaluation run of uukuguy/speechless-codellama-orca-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-orca-13b](https://huggingface.co/uukuguy/speechless-codellama-orca-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T18:58:19.504304](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-13b/blob/main/results_2023-10-23T18-58-19.504304.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2686661073825503,\n \"em_stderr\": 0.004539457381903774,\n \"f1\": 0.3305505453020149,\n \"f1_stderr\": 0.00452265523617686,\n \"acc\": 0.3499942854509976,\n \"acc_stderr\": 0.010012878870777758\n },\n \"harness|drop|3\": {\n \"em\": 0.2686661073825503,\n \"em_stderr\": 0.004539457381903774,\n \"f1\": 0.3305505453020149,\n \"f1_stderr\": 0.00452265523617686\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05989385898407885,\n \"acc_stderr\": 0.006536148151288716\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6400947119179163,\n \"acc_stderr\": 0.013489609590266799\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-codellama-orca-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|arc:challenge|25_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|arc:challenge|25_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T18_24_08.012097", "path": ["**/details_harness|drop|3_2023-10-17T18-24-08.012097.parquet"]}, {"split": "2023_10_23T18_58_19.504304", "path": ["**/details_harness|drop|3_2023-10-23T18-58-19.504304.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T18-58-19.504304.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T18_24_08.012097", "path": ["**/details_harness|gsm8k|5_2023-10-17T18-24-08.012097.parquet"]}, {"split": "2023_10_23T18_58_19.504304", "path": ["**/details_harness|gsm8k|5_2023-10-23T18-58-19.504304.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|gsm8k|5_2023-10-23T18-58-19.504304.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hellaswag|10_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hellaswag|10_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:25:23.128128.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T06:25:23.128128.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-20-48.062177.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-20-48.062177.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-20-48.062177.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T14-20-48.062177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": 
"2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": 
["**/details_harness|hendrycksTest-management|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": 
"2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T14-20-48.062177.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T14-20-48.062177.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T18_24_08.012097", "path": ["**/details_harness|winogrande|5_2023-10-17T18-24-08.012097.parquet"]}, {"split": "2023_10_23T18_58_19.504304", "path": ["**/details_harness|winogrande|5_2023-10-23T18-58-19.504304.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T18-58-19.504304.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_04T06_25_23.128128", "path": ["results_2023-09-04T06:25:23.128128.parquet"]}, {"split": "2023_09_12T14_20_48.062177", "path": ["results_2023-09-12T14-20-48.062177.parquet"]}, {"split": "2023_10_17T18_24_08.012097", "path": ["results_2023-10-17T18-24-08.012097.parquet"]}, {"split": "2023_10_23T18_58_19.504304", "path": ["results_2023-10-23T18-58-19.504304.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T18-58-19.504304.parquet"]}]}]}
2023-10-23T17:58:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model uukuguy/speechless-codellama-orca-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T18:58:19.504304 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-codellama-orca-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T18:58:19.504304(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-codellama-orca-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T18:58:19.504304(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-codellama-orca-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T18:58:19.504304(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
74c93f183100d1f1f502e76203f30cbfc8835796
Purpose: Translate English to German
vhtran/de-en-2023
[ "license:cc-by-4.0", "region:us" ]
2023-09-04T05:32:06+00:00
{"license": "cc-by-4.0"}
2023-09-04T23:55:37+00:00
[]
[]
TAGS #license-cc-by-4.0 #region-us
Purpose: Translate English to German
[]
[ "TAGS\n#license-cc-by-4.0 #region-us \n" ]
[ 15 ]
[ "passage: TAGS\n#license-cc-by-4.0 #region-us \n" ]
6fccfea866ad0845f643b9effd7026a8af88a8a6
# Dataset Card for "indo-snli" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
carles-undergrad-thesis/indo-snli
[ "region:us" ]
2023-09-04T05:51:04+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "premise", "dtype": "string"}, {"name": "hyphothesis", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 71816446, "num_examples": 550152}, {"name": "validation", "num_bytes": 1378057, "num_examples": 10000}, {"name": "test", "num_bytes": 1373665, "num_examples": 10000}], "download_size": 20415274, "dataset_size": 74568168}}
2023-09-04T05:51:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for "indo-snli" More Information needed
[ "# Dataset Card for \"indo-snli\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"indo-snli\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"indo-snli\"\n\nMore Information needed" ]
0cb5bbdc9d9acc5e9bbfea2f7c954c199ec07318
# Powpogy Fine-Tuning Dataset ## License This dataset is licensed under the [Apache-2.0 License](https://www.apache.org/licenses/LICENSE-2.0). ## Introduction This dataset was created to address the need for assessing various fine-tuning methods for machine learning models. The ultimate goal is to use this dataset to fine-tune pre-trained models and evaluate their ability to retain knowledge. ## Objective The primary objective is to offer a dataset with entirely new information that is not part of the training data for any existing models. By using this dataset, you can fine-tune a pre-trained model and assess the effectiveness of various fine-tuning techniques, particularly in terms of knowledge retention. ## About Powpogy Powpogy is a fictional superhero who does not exist in the training data of any current base or fine-tuned models. This dataset contains diverse information about Powpogy, making it an ideal resource for fine-tuning experiments. ## Usage This dataset can be used to: - Fine-tune pre-trained models - Validate the efficacy of different fine-tuning methods - Test the degree of knowledge retention in fine-tuned models ## Contributing If you have suggestions for improvements or additions to the dataset, feel free to open an issue or submit a pull request.
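As a concrete starting point for the usage described above, a minimal sketch of pulling the dataset into a fine-tuning pipeline might look like the following. The split name and column layout are assumptions, since the card does not spell them out; inspecting the columns first is the safe move.

```python
from datasets import load_dataset

# Assumed single "train" split; the card does not document the column names,
# so inspect them before wiring the data into a fine-tuning pipeline.
ds = load_dataset("eduardem/powpogy", split="train")
print(ds.column_names)
print(ds[0])
```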
eduardem/powpogy
[ "license:apache-2.0", "region:us" ]
2023-09-04T05:57:52+00:00
{"license": "apache-2.0"}
2023-09-04T06:07:39+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
# Powpogy Fine-Tuning Dataset ## License This dataset is licensed under the Apache-2.0 License. ## Introduction This dataset was created to address the need for assessing various fine-tuning methods for machine learning models. The ultimate goal is to use this dataset to fine-tune pre-trained models and evaluate their ability to retain knowledge. ## Objective The primary objective is to offer a dataset with entirely new information that is not part of the training data for any existing models. By using this dataset, you can fine-tune a pre-trained model and assess the effectiveness of various fine-tuning techniques, particularly in terms of knowledge retention. ## About Powpogy Powpogy is a fictional superhero who does not exist in the training data of any current base or fine-tuned models. This dataset contains diverse information about Powpogy, making it an ideal resource for fine-tuning experiments. ## Usage This dataset can be used to: - Fine-tune pre-trained models - Validate the efficacy of different fine-tuning methods - Test the degree of knowledge retention in fine-tuned models ## Contributing If you have suggestions for improvements or additions to the dataset, feel free to open an issue or submit a pull request.
[ "# Powpogy Fine-Tuning Dataset", "## License\nThis dataset is licensed under the Apache-2.0 License.", "## Introduction\n\nThis dataset was created to address the need for assessing various fine-tuning methods for machine learning models. The ultimate goal is to use this dataset to fine-tune pre-trained models and evaluate their ability to retain knowledge.", "## Objective\n\nThe primary objective is to offer a dataset with entirely new information that is not part of the training data for any existing models. By using this dataset, you can fine-tune a pre-trained model and assess the effectiveness of various fine-tuning techniques, particularly in terms of knowledge retention.", "## About Powpogy\n\nPowpogy is a fictional superhero who does not exist in the training data of any current base or fine-tuned models. This dataset contains diverse information about Powpogy, making it an ideal resource for fine-tuning experiments.", "## Usage\n\nThis dataset can be used to:\n- Fine-tune pre-trained models\n- Validate the efficacy of different fine-tuning methods\n- Test the degree of knowledge retention in fine-tuned models", "## Contributing\n\nIf you have suggestions for improvements or additions to the dataset, feel free to open an issue or submit a pull request." ]
[ "TAGS\n#license-apache-2.0 #region-us \n", "# Powpogy Fine-Tuning Dataset", "## License\nThis dataset is licensed under the Apache-2.0 License.", "## Introduction\n\nThis dataset was created to address the need for assessing various fine-tuning methods for machine learning models. The ultimate goal is to use this dataset to fine-tune pre-trained models and evaluate their ability to retain knowledge.", "## Objective\n\nThe primary objective is to offer a dataset with entirely new information that is not part of the training data for any existing models. By using this dataset, you can fine-tune a pre-trained model and assess the effectiveness of various fine-tuning techniques, particularly in terms of knowledge retention.", "## About Powpogy\n\nPowpogy is a fictional superhero who does not exist in the training data of any current base or fine-tuned models. This dataset contains diverse information about Powpogy, making it an ideal resource for fine-tuning experiments.", "## Usage\n\nThis dataset can be used to:\n- Fine-tune pre-trained models\n- Validate the efficacy of different fine-tuning methods\n- Test the degree of knowledge retention in fine-tuned models", "## Contributing\n\nIf you have suggestions for improvements or additions to the dataset, feel free to open an issue or submit a pull request." ]
[ 14, 10, 16, 55, 67, 58, 48, 31 ]
[ "passage: TAGS\n#license-apache-2.0 #region-us \n# Powpogy Fine-Tuning Dataset## License\nThis dataset is licensed under the Apache-2.0 License.## Introduction\n\nThis dataset was created to address the need for assessing various fine-tuning methods for machine learning models. The ultimate goal is to use this dataset to fine-tune pre-trained models and evaluate their ability to retain knowledge.## Objective\n\nThe primary objective is to offer a dataset with entirely new information that is not part of the training data for any existing models. By using this dataset, you can fine-tune a pre-trained model and assess the effectiveness of various fine-tuning techniques, particularly in terms of knowledge retention.## About Powpogy\n\nPowpogy is a fictional superhero who does not exist in the training data of any current base or fine-tuned models. This dataset contains diverse information about Powpogy, making it an ideal resource for fine-tuning experiments.## Usage\n\nThis dataset can be used to:\n- Fine-tune pre-trained models\n- Validate the efficacy of different fine-tuning methods\n- Test the degree of knowledge retention in fine-tuned models## Contributing\n\nIf you have suggestions for improvements or additions to the dataset, feel free to open an issue or submit a pull request." ]
2a1324aa2d4b21516b72b807aefbd988a3872c64
### Task Introduction
Fin-Eval covers multiple financial domains such as wealth management, insurance, and investment research. It is an evaluation dataset built specifically for large language models in the financial domain, covering 28 sub-tasks across five capability categories: cognition, generation, financial knowledge, financial logic, and safety & compliance. The tasks were designed with the characteristics of large models in In-Context Learning, tool calling, CoT, and related areas fully in mind.

### Data Loading
```python
from datasets import load_dataset

dataset = load_dataset("TracyMc/testdataset", name="test")
print(dataset["test"][0])
# Field names are Chinese: '大类' = category, '任务' = task, '问题' = question,
# '答案' = answer, '解释' = explanation.
# {'id': 1, '大类': '认知', '任务': '金融意图理解', '问题': '近期美元汇率有没有大幅波动', '答案': '行情解读', '解释': None}
```
For more details on dataset usage and the evaluation methodology, see the [github page](https://github.com).

To obtain the full dataset, please send an email to request authorization: contact [email protected].
TracyMc/testdataset
[ "license:mit", "region:us" ]
2023-09-04T06:04:02+00:00
{"license": "mit"}
2023-09-04T08:22:31+00:00
[]
[]
TAGS #license-mit #region-us
### Task Introduction Fin-Eval covers multiple financial domains such as wealth management, insurance, and investment research. It is an evaluation dataset built specifically for large language models in the financial domain, covering 28 sub-tasks across five capability categories: cognition, generation, financial knowledge, financial logic, and safety & compliance. The tasks were designed with the characteristics of large models in In-Context Learning, tool calling, CoT, and related areas fully in mind. ### Data Loading For more details on dataset usage and the evaluation methodology, see the github page. To obtain the full dataset, please send an email to request authorization: contact URL@URL.
[ "### 任务介绍\nFin-Eval涵盖了财富管理、保险、投资研究等多个金融领域,是专为金融领域大模型而构建的评估数据集,涵盖认知、生成、金融知识、金融逻辑以及安全合规五大类能力共28个子任务。在设计任务时,充分考虑到大模型在In-Context Learning、工具调用、CoT等方面的特点。", "### 数据读取\n\n\n更多数据集使用细节和评估方法见github page\n\n如果需要全量数据集,请发送邮件申请授权:联系邮箱 URL@URL。" ]
[ "TAGS\n#license-mit #region-us \n", "### 任务介绍\nFin-Eval涵盖了财富管理、保险、投资研究等多个金融领域,是专为金融领域大模型而构建的评估数据集,涵盖认知、生成、金融知识、金融逻辑以及安全合规五大类能力共28个子任务。在设计任务时,充分考虑到大模型在In-Context Learning、工具调用、CoT等方面的特点。", "### 数据读取\n\n\n更多数据集使用细节和评估方法见github page\n\n如果需要全量数据集,请发送邮件申请授权:联系邮箱 URL@URL。" ]
[ 11, 85, 38 ]
[ "passage: TAGS\n#license-mit #region-us \n### 任务介绍\nFin-Eval涵盖了财富管理、保险、投资研究等多个金融领域,是专为金融领域大模型而构建的评估数据集,涵盖认知、生成、金融知识、金融逻辑以及安全合规五大类能力共28个子任务。在设计任务时,充分考虑到大模型在In-Context Learning、工具调用、CoT等方面的特点。### 数据读取\n\n\n更多数据集使用细节和评估方法见github page\n\n如果需要全量数据集,请发送邮件申请授权:联系邮箱 URL@URL。" ]
ce76a701b74e72fe8df18f3a0bed9be12762d021
# Parrot_en_es Fine-Tuning Dataset ## Introduction This dataset serves as a sample for fine-tuning the `parrot_en_es` translation model. The data is structured in the OpenOrca format and is designed to train the model to perform specific tasks related to translation. ## Objective The primary aim of this dataset is to achieve alignment in the model so that it acts strictly as a translator. The model should output solely the translation, avoiding any additional comments, considerations, extra content, fixes, or refusals. ## Format ### OpenOrca The dataset employs the OpenOrca format and maintains a consistent system prompt across all samples. This uniformity in system prompts is intended to guide the model towards the desired behavior, which is to produce only the translation. ## Usage This dataset can be used to: - Fine-tune the `parrot_en_es` translation model - Validate the model's ability to output clean translations without any superfluous content - Assess the effectiveness of using a consistent system prompt to guide model behavior ## Contributing For suggestions to improve or expand the dataset, feel free to open an issue or submit a pull request.
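Because the card states that the samples follow the OpenOrca format with one consistent system prompt, that property can be checked directly before fine-tuning. The split and column names below ("system_prompt", "question", "response") are the usual OpenOrca ones and are assumptions here, since the card does not list them explicitly.

```python
from datasets import load_dataset

# Assumed split and OpenOrca-style column names ("system_prompt", "question",
# "response"); adjust after checking ds.column_names if they differ.
ds = load_dataset("eduardem/parrot_en_es_sample", split="train")
print(ds.column_names)

# The card claims one consistent system prompt across all samples:
assert len(set(ds["system_prompt"])) == 1
```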
eduardem/parrot_en_es_sample
[ "license:openrail", "region:us" ]
2023-09-04T06:09:08+00:00
{"license": "openrail"}
2023-09-04T06:14:17+00:00
[]
[]
TAGS #license-openrail #region-us
# Parrot_en_es Fine-Tuning Dataset ## Introduction This dataset serves as a sample for fine-tuning the 'parrot_en_es' translation model. The data is structured in the OpenOrca format and is designed to train the model to perform specific tasks related to translation. ## Objective The primary aim of this dataset is to achieve alignment in the model so that it acts strictly as a translator. The model should output solely the translation, avoiding any additional comments, considerations, extra content, fixes, or refusals. ## Format ### OpenOrca The dataset employs the OpenOrca format and maintains a consistent system prompt across all samples. This uniformity in system prompts is intended to guide the model towards the desired behavior, which is to produce only the translation. ## Usage This dataset can be used to: - Fine-tune the 'parrot_en_es' translation model - Validate the model's ability to output clean translations without any superfluous content - Assess the effectiveness of using a consistent system prompt to guide model behavior ## Contributing For suggestions to improve or expand the dataset, feel free to open an issue or submit a pull request.
[ "# Parrot_en_es Fine-Tuning Dataset", "## Introduction\n\nThis dataset serves as a sample for fine-tuning the 'parrot_en_es' translation model. The data is structured in the OpenOrca format and is designed to train the model to perform specific tasks related to translation.", "## Objective\n\nThe primary aim of this dataset is to achieve alignment in the model so that it acts strictly as a translator. The model should output solely the translation, avoiding any additional comments, considerations, extra content, fixes, or refusals.", "## Format", "### OpenOrca\n\nThe dataset employs the OpenOrca format and maintains a consistent system prompt across all samples. This uniformity in system prompts is intended to guide the model towards the desired behavior, which is to produce only the translation.", "## Usage\n\nThis dataset can be used to:\n- Fine-tune the 'parrot_en_es' translation model\n- Validate the model's ability to output clean translations without any superfluous content\n- Assess the effectiveness of using a consistent system prompt to guide model behavior", "## Contributing\n\nFor suggestions to improve or expand the dataset, feel free to open an issue or submit a pull request." ]
[ "TAGS\n#license-openrail #region-us \n", "# Parrot_en_es Fine-Tuning Dataset", "## Introduction\n\nThis dataset serves as a sample for fine-tuning the 'parrot_en_es' translation model. The data is structured in the OpenOrca format and is designed to train the model to perform specific tasks related to translation.", "## Objective\n\nThe primary aim of this dataset is to achieve alignment in the model so that it acts strictly as a translator. The model should output solely the translation, avoiding any additional comments, considerations, extra content, fixes, or refusals.", "## Format", "### OpenOrca\n\nThe dataset employs the OpenOrca format and maintains a consistent system prompt across all samples. This uniformity in system prompts is intended to guide the model towards the desired behavior, which is to produce only the translation.", "## Usage\n\nThis dataset can be used to:\n- Fine-tune the 'parrot_en_es' translation model\n- Validate the model's ability to output clean translations without any superfluous content\n- Assess the effectiveness of using a consistent system prompt to guide model behavior", "## Contributing\n\nFor suggestions to improve or expand the dataset, feel free to open an issue or submit a pull request." ]
[ 12, 13, 55, 61, 2, 54, 62, 26 ]
[ "passage: TAGS\n#license-openrail #region-us \n# Parrot_en_es Fine-Tuning Dataset## Introduction\n\nThis dataset serves as a sample for fine-tuning the 'parrot_en_es' translation model. The data is structured in the OpenOrca format and is designed to train the model to perform specific tasks related to translation.## Objective\n\nThe primary aim of this dataset is to achieve alignment in the model so that it acts strictly as a translator. The model should output solely the translation, avoiding any additional comments, considerations, extra content, fixes, or refusals.## Format### OpenOrca\n\nThe dataset employs the OpenOrca format and maintains a consistent system prompt across all samples. This uniformity in system prompts is intended to guide the model towards the desired behavior, which is to produce only the translation.## Usage\n\nThis dataset can be used to:\n- Fine-tune the 'parrot_en_es' translation model\n- Validate the model's ability to output clean translations without any superfluous content\n- Assess the effectiveness of using a consistent system prompt to guide model behavior## Contributing\n\nFor suggestions to improve or expand the dataset, feel free to open an issue or submit a pull request." ]
2275485b0c9b21a2bc05737b05a5bd3dda147314
# Dataset Card for "dnotes-dataset-v2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
senga-ml/dnotes-dataset-v2
[ "region:us" ]
2023-09-04T06:09:39+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "ground_truth", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 185713056.0, "num_examples": 88}, {"name": "validation", "num_bytes": 22128331.0, "num_examples": 10}, {"name": "test", "num_bytes": 6254889.0, "num_examples": 6}], "download_size": 213932198, "dataset_size": 214096276.0}}
2023-09-04T14:14:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for "dnotes-dataset-v2" More Information needed
[ "# Dataset Card for \"dnotes-dataset-v2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"dnotes-dataset-v2\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"dnotes-dataset-v2\"\n\nMore Information needed" ]
09625ab381fcdad0f6b39533aca5d080137424b8
# Dataset Card for "BEBO_DS" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
gurprbebo/BEBO_DS
[ "region:us" ]
2023-09-04T06:19:23+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4947, "num_examples": 15}], "download_size": 3142, "dataset_size": 4947}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-04T06:37:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for "BEBO_DS" More Information needed
[ "# Dataset Card for \"BEBO_DS\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"BEBO_DS\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"BEBO_DS\"\n\nMore Information needed" ]
743473fd52cae57710891dd3b39e959e3893e977
# Dataset Card for "psychology-dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
fiveflow/psychology-dataset
[ "region:us" ]
2023-09-04T06:22:56+00:00
{"dataset_info": {"features": [{"name": "index", "dtype": "int64"}, {"name": "Unnamed: 0", "dtype": "int64"}, {"name": "question", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "keywords", "dtype": "string"}, {"name": "resp", "dtype": "string"}, {"name": "new_resp", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3612449, "num_examples": 2710}], "download_size": 1189445, "dataset_size": 3612449}}
2023-09-05T04:21:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for "psychology-dataset" More Information needed
[ "# Dataset Card for \"psychology-dataset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"psychology-dataset\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"psychology-dataset\"\n\nMore Information needed" ]
85a63b9f395f20e03a32496fdafd331be64c0bb3
# Dataset Card for "AA_BERT_FT" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/AA_BERT_FT
[ "region:us" ]
2023-09-04T06:30:51+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80318765, "num_examples": 26057}, {"name": "test", "num_bytes": 26774056, "num_examples": 8686}], "download_size": 147053945, "dataset_size": 107092821}}
2023-09-04T06:31:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for "AA_BERT_FT" More Information needed
[ "# Dataset Card for \"AA_BERT_FT\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"AA_BERT_FT\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"AA_BERT_FT\"\n\nMore Information needed" ]
3db182ea5cde44326550b8fbd07551ba8d9764d2
# Dataset Card for "AA_RoBERTa_FT" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/AA_RoBERTa_FT
[ "region:us" ]
2023-09-04T06:33:24+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80318765, "num_examples": 26057}, {"name": "test", "num_bytes": 26774056, "num_examples": 8686}], "download_size": 147154828, "dataset_size": 107092821}}
2023-09-04T06:33:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for "AA_RoBERTa_FT" More Information needed
[ "# Dataset Card for \"AA_RoBERTa_FT\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"AA_RoBERTa_FT\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"AA_RoBERTa_FT\"\n\nMore Information needed" ]
f11e27aa32753500b24d492e91f61c864dbf0665
# Dataset Card for "AA_DistilRoBERTa_FT" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/AA_DistilRoBERTa_FT
[ "region:us" ]
2023-09-04T06:35:12+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80318765, "num_examples": 26057}, {"name": "test", "num_bytes": 26774056, "num_examples": 8686}], "download_size": 147154939, "dataset_size": 107092821}}
2023-09-04T06:35:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for "AA_DistilRoBERTa_FT" More Information needed
[ "# Dataset Card for \"AA_DistilRoBERTa_FT\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"AA_DistilRoBERTa_FT\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"AA_DistilRoBERTa_FT\"\n\nMore Information needed" ]
f13284dc561ca6d9851853b87a145f042786c278
# European Languages Multipurpose Dataset for NER

## Introduction

This is a multipurpose dataset that includes names, proper nouns, and acronyms from various European languages. It is designed to be particularly useful for Named Entity Recognition (NER) tasks.

## Language Composition

The dataset predominantly features data in English, followed by Spanish, French, and Romanian.

## Objective

The primary aim of this dataset is to further fine-tune base BERT or RoBERTa models for tasks related to Named Entity Recognition (NER).

## Dataset Content

- **Names**: Includes first names, last names, and full names.
- **Proper Nouns**: Includes names of places, organizations, etc.
- **Acronyms**: Includes acronyms commonly used in the featured languages.

## Usage

This dataset can be used to:

- Fine-tune BERT or RoBERTa models for NER tasks
- Evaluate the effectiveness of different fine-tuning techniques on NER
- Conduct multilingual NER research

(See the loading sketch after this card.)

## Contributing

If you have suggestions for improving or expanding the dataset, please feel free to open an issue or submit a pull request.
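Since the card does not document the column schema, a sketch in the spirit of the Usage section would inspect it before wiring the data into an NER pipeline. Only the repo id `eduardem/ner_acro_combined` is taken from this record; the `"text"` column name in the tokenization step is purely a placeholder assumption:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Inspect the schema first -- the column layout is not documented on the card.
ds = load_dataset("eduardem/ner_acro_combined")
print({name: split.features for name, split in ds.items()})

tokenizer = AutoTokenizer.from_pretrained("roberta-base")

def tokenize(batch):
    # "text" is a placeholder; substitute whatever column the inspection reveals.
    return tokenizer(batch["text"], truncation=True)

encoded = ds.map(tokenize, batched=True)
```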
eduardem/ner_acro_combined
[ "license:openrail", "region:us" ]
2023-09-04T06:35:14+00:00
{"license": "openrail"}
2023-09-04T06:39:13+00:00
[]
[]
TAGS #license-openrail #region-us
# European Languages Multipurpose Dataset for NER ## Introduction This is a multipurpose dataset that includes names, proper nouns, and acronyms from various European languages. It is designed to be particularly useful for Named Entity Recognition (NER) tasks. ## Language Composition The dataset predominantly features data in English, followed by Spanish, French, and Romanian. ## Objective The primary aim of this dataset is to further fine-tune base BERT or RoBERTa models for tasks related to Named Entity Recognition (NER). ## Dataset Content - Names: Includes first names, last names, and full names. - Proper Nouns: Includes names of places, organizations, etc. - Acronyms: Includes acronyms commonly used in the featured languages. ## Usage This dataset can be used to: - Fine-tune BERT or RoBERTa models for NER tasks - Evaluate the effectiveness of different fine-tuning techniques on NER - Conduct multilingual NER research ## Contributing If you have suggestions for improving or expanding the dataset, please feel free to open an issue or submit a pull request.
[ "# European Languages Multipurpose Dataset for NER", "## Introduction\n\nThis is a multipurpose dataset that includes names, proper nouns, and acronyms from various European languages. It is designed to be particularly useful for Named Entity Recognition (NER) tasks.", "## Language Composition\n\nThe dataset predominantly features data in English, followed by Spanish, French, and Romanian.", "## Objective\n\nThe primary aim of this dataset is to further fine-tune base BERT or RoBERTa models for tasks related to Named Entity Recognition (NER).", "## Dataset Content\n\n- Names: Includes first names, last names, and full names.\n- Proper Nouns: Includes names of places, organizations, etc.\n- Acronyms: Includes acronyms commonly used in the featured languages.", "## Usage\n\nThis dataset can be used to:\n- Fine-tune BERT or RoBERTa models for NER tasks\n- Evaluate the effectiveness of different fine-tuning techniques on NER\n- Conduct multilingual NER research", "## Contributing\n\nIf you have suggestions for improving or expanding the dataset, please feel free to open an issue or submit a pull request." ]
[ "TAGS\n#license-openrail #region-us \n", "# European Languages Multipurpose Dataset for NER", "## Introduction\n\nThis is a multipurpose dataset that includes names, proper nouns, and acronyms from various European languages. It is designed to be particularly useful for Named Entity Recognition (NER) tasks.", "## Language Composition\n\nThe dataset predominantly features data in English, followed by Spanish, French, and Romanian.", "## Objective\n\nThe primary aim of this dataset is to further fine-tune base BERT or RoBERTa models for tasks related to Named Entity Recognition (NER).", "## Dataset Content\n\n- Names: Includes first names, last names, and full names.\n- Proper Nouns: Includes names of places, organizations, etc.\n- Acronyms: Includes acronyms commonly used in the featured languages.", "## Usage\n\nThis dataset can be used to:\n- Fine-tune BERT or RoBERTa models for NER tasks\n- Evaluate the effectiveness of different fine-tuning techniques on NER\n- Conduct multilingual NER research", "## Contributing\n\nIf you have suggestions for improving or expanding the dataset, please feel free to open an issue or submit a pull request." ]
[ 12, 12, 51, 25, 40, 63, 52, 30 ]
[ "passage: TAGS\n#license-openrail #region-us \n# European Languages Multipurpose Dataset for NER## Introduction\n\nThis is a multipurpose dataset that includes names, proper nouns, and acronyms from various European languages. It is designed to be particularly useful for Named Entity Recognition (NER) tasks.## Language Composition\n\nThe dataset predominantly features data in English, followed by Spanish, French, and Romanian.## Objective\n\nThe primary aim of this dataset is to further fine-tune base BERT or RoBERTa models for tasks related to Named Entity Recognition (NER).## Dataset Content\n\n- Names: Includes first names, last names, and full names.\n- Proper Nouns: Includes names of places, organizations, etc.\n- Acronyms: Includes acronyms commonly used in the featured languages.## Usage\n\nThis dataset can be used to:\n- Fine-tune BERT or RoBERTa models for NER tasks\n- Evaluate the effectiveness of different fine-tuning techniques on NER\n- Conduct multilingual NER research## Contributing\n\nIf you have suggestions for improving or expanding the dataset, please feel free to open an issue or submit a pull request." ]
4a33c94e22c9251a8451100af10f27728132015e
# Dataset Card for "AA_GPT2_FT" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/AA_GPT2_FT
[ "region:us" ]
2023-09-04T06:37:40+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80318765, "num_examples": 26057}, {"name": "test", "num_bytes": 26774056, "num_examples": 8686}], "download_size": 147157938, "dataset_size": 107092821}}
2023-09-04T06:38:06+00:00
[]
[]
8596c0fb16ca906404575242028f6ac537065e2a
# Dataset Card for "generadai-sample" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jschew39/generadai-sample
[ "region:us" ]
2023-09-04T07:02:20+00:00
{"dataset_info": {"features": [{"name": "item", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "ad", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6765, "num_examples": 5}], "download_size": 11936, "dataset_size": 6765}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-04T07:02:22+00:00
[]
[]
4b6e7e3810e6073d384c376ffd154251c2fb750b
# Dataset Card for "AA_GPTNEO_FT" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/AA_GPTNEO_FT
[ "region:us" ]
2023-09-04T07:03:34+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "768", "dtype": "float32"}, {"name": "769", "dtype": "float32"}, {"name": "770", "dtype": "float32"}, {"name": "771", "dtype": "float32"}, {"name": "772", "dtype": "float32"}, {"name": "773", "dtype": "float32"}, {"name": "774", "dtype": "float32"}, {"name": "775", "dtype": "float32"}, {"name": "776", "dtype": "float32"}, {"name": "777", "dtype": "float32"}, {"name": "778", "dtype": "float32"}, {"name": "779", "dtype": "float32"}, {"name": "780", "dtype": "float32"}, {"name": "781", "dtype": "float32"}, {"name": "782", "dtype": "float32"}, {"name": "783", "dtype": "float32"}, {"name": "784", "dtype": "float32"}, {"name": "785", "dtype": "float32"}, {"name": "786", "dtype": "float32"}, {"name": "787", "dtype": "float32"}, {"name": "788", "dtype": "float32"}, {"name": "789", "dtype": "float32"}, {"name": "790", "dtype": "float32"}, {"name": "791", "dtype": "float32"}, {"name": "792", "dtype": "float32"}, {"name": "793", "dtype": "float32"}, {"name": "794", "dtype": "float32"}, {"name": "795", "dtype": "float32"}, {"name": "796", "dtype": "float32"}, {"name": "797", "dtype": "float32"}, {"name": "798", "dtype": "float32"}, {"name": "799", "dtype": "float32"}, {"name": "800", "dtype": "float32"}, {"name": "801", "dtype": "float32"}, {"name": "802", "dtype": "float32"}, {"name": "803", "dtype": "float32"}, {"name": "804", "dtype": "float32"}, {"name": "805", "dtype": "float32"}, {"name": "806", "dtype": "float32"}, {"name": "807", "dtype": "float32"}, {"name": "808", "dtype": "float32"}, {"name": "809", "dtype": "float32"}, {"name": "810", "dtype": "float32"}, {"name": "811", "dtype": "float32"}, {"name": "812", "dtype": "float32"}, {"name": "813", "dtype": "float32"}, {"name": "814", "dtype": "float32"}, {"name": "815", "dtype": "float32"}, {"name": "816", "dtype": "float32"}, {"name": "817", "dtype": "float32"}, {"name": "818", "dtype": "float32"}, {"name": "819", "dtype": "float32"}, {"name": "820", "dtype": "float32"}, {"name": "821", "dtype": "float32"}, {"name": "822", "dtype": "float32"}, {"name": "823", "dtype": "float32"}, {"name": "824", "dtype": "float32"}, {"name": "825", "dtype": "float32"}, {"name": "826", "dtype": "float32"}, {"name": "827", "dtype": "float32"}, {"name": "828", "dtype": "float32"}, {"name": "829", "dtype": "float32"}, {"name": "830", "dtype": "float32"}, {"name": "831", "dtype": "float32"}, {"name": "832", "dtype": "float32"}, {"name": "833", "dtype": "float32"}, {"name": "834", "dtype": "float32"}, {"name": "835", "dtype": "float32"}, {"name": "836", "dtype": "float32"}, {"name": "837", "dtype": "float32"}, {"name": "838", "dtype": "float32"}, {"name": "839", "dtype": "float32"}, {"name": "840", "dtype": "float32"}, {"name": "841", "dtype": "float32"}, {"name": "842", "dtype": "float32"}, {"name": "843", "dtype": "float32"}, {"name": "844", "dtype": "float32"}, {"name": "845", "dtype": "float32"}, {"name": "846", "dtype": "float32"}, {"name": "847", "dtype": "float32"}, {"name": "848", "dtype": "float32"}, {"name": "849", "dtype": "float32"}, {"name": "850", "dtype": "float32"}, {"name": "851", "dtype": "float32"}, {"name": "852", "dtype": "float32"}, {"name": "853", "dtype": "float32"}, {"name": "854", "dtype": "float32"}, {"name": "855", "dtype": "float32"}, {"name": "856", "dtype": "float32"}, {"name": "857", "dtype": "float32"}, {"name": "858", "dtype": "float32"}, {"name": "859", "dtype": "float32"}, {"name": "860", "dtype": "float32"}, {"name": "861", "dtype": "float32"}, {"name": 
"862", "dtype": "float32"}, {"name": "863", "dtype": "float32"}, {"name": "864", "dtype": "float32"}, {"name": "865", "dtype": "float32"}, {"name": "866", "dtype": "float32"}, {"name": "867", "dtype": "float32"}, {"name": "868", "dtype": "float32"}, {"name": "869", "dtype": "float32"}, {"name": "870", "dtype": "float32"}, {"name": "871", "dtype": "float32"}, {"name": "872", "dtype": "float32"}, {"name": "873", "dtype": "float32"}, {"name": "874", "dtype": "float32"}, {"name": "875", "dtype": "float32"}, {"name": "876", "dtype": "float32"}, {"name": "877", "dtype": "float32"}, {"name": "878", "dtype": "float32"}, {"name": "879", "dtype": "float32"}, {"name": "880", "dtype": "float32"}, {"name": "881", "dtype": "float32"}, {"name": "882", "dtype": "float32"}, {"name": "883", "dtype": "float32"}, {"name": "884", "dtype": "float32"}, {"name": "885", "dtype": "float32"}, {"name": "886", "dtype": "float32"}, {"name": "887", "dtype": "float32"}, {"name": "888", "dtype": "float32"}, {"name": "889", "dtype": "float32"}, {"name": "890", "dtype": "float32"}, {"name": "891", "dtype": "float32"}, {"name": "892", "dtype": "float32"}, {"name": "893", "dtype": "float32"}, {"name": "894", "dtype": "float32"}, {"name": "895", "dtype": "float32"}, {"name": "896", "dtype": "float32"}, {"name": "897", "dtype": "float32"}, {"name": "898", "dtype": "float32"}, {"name": "899", "dtype": "float32"}, {"name": "900", "dtype": "float32"}, {"name": "901", "dtype": "float32"}, {"name": "902", "dtype": "float32"}, {"name": "903", "dtype": "float32"}, {"name": "904", "dtype": "float32"}, {"name": "905", "dtype": "float32"}, {"name": "906", "dtype": "float32"}, {"name": "907", "dtype": "float32"}, {"name": "908", "dtype": "float32"}, {"name": "909", "dtype": "float32"}, {"name": "910", "dtype": "float32"}, {"name": "911", "dtype": "float32"}, {"name": "912", "dtype": "float32"}, {"name": "913", "dtype": "float32"}, {"name": "914", "dtype": "float32"}, {"name": "915", "dtype": "float32"}, {"name": "916", "dtype": "float32"}, {"name": "917", "dtype": "float32"}, {"name": "918", "dtype": "float32"}, {"name": "919", "dtype": "float32"}, {"name": "920", "dtype": "float32"}, {"name": "921", "dtype": "float32"}, {"name": "922", "dtype": "float32"}, {"name": "923", "dtype": "float32"}, {"name": "924", "dtype": "float32"}, {"name": "925", "dtype": "float32"}, {"name": "926", "dtype": "float32"}, {"name": "927", "dtype": "float32"}, {"name": "928", "dtype": "float32"}, {"name": "929", "dtype": "float32"}, {"name": "930", "dtype": "float32"}, {"name": "931", "dtype": "float32"}, {"name": "932", "dtype": "float32"}, {"name": "933", "dtype": "float32"}, {"name": "934", "dtype": "float32"}, {"name": "935", "dtype": "float32"}, {"name": "936", "dtype": "float32"}, {"name": "937", "dtype": "float32"}, {"name": "938", "dtype": "float32"}, {"name": "939", "dtype": "float32"}, {"name": "940", "dtype": "float32"}, {"name": "941", "dtype": "float32"}, {"name": "942", "dtype": "float32"}, {"name": "943", "dtype": "float32"}, {"name": "944", "dtype": "float32"}, {"name": "945", "dtype": "float32"}, {"name": "946", "dtype": "float32"}, {"name": "947", "dtype": "float32"}, {"name": "948", "dtype": "float32"}, {"name": "949", "dtype": "float32"}, {"name": "950", "dtype": "float32"}, {"name": "951", "dtype": "float32"}, {"name": "952", "dtype": "float32"}, {"name": "953", "dtype": "float32"}, {"name": "954", "dtype": "float32"}, {"name": "955", "dtype": "float32"}, {"name": "956", "dtype": "float32"}, {"name": "957", "dtype": "float32"}, {"name": 
"958", "dtype": "float32"}, {"name": "959", "dtype": "float32"}, {"name": "960", "dtype": "float32"}, {"name": "961", "dtype": "float32"}, {"name": "962", "dtype": "float32"}, {"name": "963", "dtype": "float32"}, {"name": "964", "dtype": "float32"}, {"name": "965", "dtype": "float32"}, {"name": "966", "dtype": "float32"}, {"name": "967", "dtype": "float32"}, {"name": "968", "dtype": "float32"}, {"name": "969", "dtype": "float32"}, {"name": "970", "dtype": "float32"}, {"name": "971", "dtype": "float32"}, {"name": "972", "dtype": "float32"}, {"name": "973", "dtype": "float32"}, {"name": "974", "dtype": "float32"}, {"name": "975", "dtype": "float32"}, {"name": "976", "dtype": "float32"}, {"name": "977", "dtype": "float32"}, {"name": "978", "dtype": "float32"}, {"name": "979", "dtype": "float32"}, {"name": "980", "dtype": "float32"}, {"name": "981", "dtype": "float32"}, {"name": "982", "dtype": "float32"}, {"name": "983", "dtype": "float32"}, {"name": "984", "dtype": "float32"}, {"name": "985", "dtype": "float32"}, {"name": "986", "dtype": "float32"}, {"name": "987", "dtype": "float32"}, {"name": "988", "dtype": "float32"}, {"name": "989", "dtype": "float32"}, {"name": "990", "dtype": "float32"}, {"name": "991", "dtype": "float32"}, {"name": "992", "dtype": "float32"}, {"name": "993", "dtype": "float32"}, {"name": "994", "dtype": "float32"}, {"name": "995", "dtype": "float32"}, {"name": "996", "dtype": "float32"}, {"name": "997", "dtype": "float32"}, {"name": "998", "dtype": "float32"}, {"name": "999", "dtype": "float32"}, {"name": "1000", "dtype": "float32"}, {"name": "1001", "dtype": "float32"}, {"name": "1002", "dtype": "float32"}, {"name": "1003", "dtype": "float32"}, {"name": "1004", "dtype": "float32"}, {"name": "1005", "dtype": "float32"}, {"name": "1006", "dtype": "float32"}, {"name": "1007", "dtype": "float32"}, {"name": "1008", "dtype": "float32"}, {"name": "1009", "dtype": "float32"}, {"name": "1010", "dtype": "float32"}, {"name": "1011", "dtype": "float32"}, {"name": "1012", "dtype": "float32"}, {"name": "1013", "dtype": "float32"}, {"name": "1014", "dtype": "float32"}, {"name": "1015", "dtype": "float32"}, {"name": "1016", "dtype": "float32"}, {"name": "1017", "dtype": "float32"}, {"name": "1018", "dtype": "float32"}, {"name": "1019", "dtype": "float32"}, {"name": "1020", "dtype": "float32"}, {"name": "1021", "dtype": "float32"}, {"name": "1022", "dtype": "float32"}, {"name": "1023", "dtype": "float32"}, {"name": "1024", "dtype": "float32"}, {"name": "1025", "dtype": "float32"}, {"name": "1026", "dtype": "float32"}, {"name": "1027", "dtype": "float32"}, {"name": "1028", "dtype": "float32"}, {"name": "1029", "dtype": "float32"}, {"name": "1030", "dtype": "float32"}, {"name": "1031", "dtype": "float32"}, {"name": "1032", "dtype": "float32"}, {"name": "1033", "dtype": "float32"}, {"name": "1034", "dtype": "float32"}, {"name": "1035", "dtype": "float32"}, {"name": "1036", "dtype": "float32"}, {"name": "1037", "dtype": "float32"}, {"name": "1038", "dtype": "float32"}, {"name": "1039", "dtype": "float32"}, {"name": "1040", "dtype": "float32"}, {"name": "1041", "dtype": "float32"}, {"name": "1042", "dtype": "float32"}, {"name": "1043", "dtype": "float32"}, {"name": "1044", "dtype": "float32"}, {"name": "1045", "dtype": "float32"}, {"name": "1046", "dtype": "float32"}, {"name": "1047", "dtype": "float32"}, {"name": "1048", "dtype": "float32"}, {"name": "1049", "dtype": "float32"}, {"name": "1050", "dtype": "float32"}, {"name": "1051", "dtype": "float32"}, {"name": "1052", "dtype": 
"float32"}, {"name": "1053", "dtype": "float32"}, {"name": "1054", "dtype": "float32"}, {"name": "1055", "dtype": "float32"}, {"name": "1056", "dtype": "float32"}, {"name": "1057", "dtype": "float32"}, {"name": "1058", "dtype": "float32"}, {"name": "1059", "dtype": "float32"}, {"name": "1060", "dtype": "float32"}, {"name": "1061", "dtype": "float32"}, {"name": "1062", "dtype": "float32"}, {"name": "1063", "dtype": "float32"}, {"name": "1064", "dtype": "float32"}, {"name": "1065", "dtype": "float32"}, {"name": "1066", "dtype": "float32"}, {"name": "1067", "dtype": "float32"}, {"name": "1068", "dtype": "float32"}, {"name": "1069", "dtype": "float32"}, {"name": "1070", "dtype": "float32"}, {"name": "1071", "dtype": "float32"}, {"name": "1072", "dtype": "float32"}, {"name": "1073", "dtype": "float32"}, {"name": "1074", "dtype": "float32"}, {"name": "1075", "dtype": "float32"}, {"name": "1076", "dtype": "float32"}, {"name": "1077", "dtype": "float32"}, {"name": "1078", "dtype": "float32"}, {"name": "1079", "dtype": "float32"}, {"name": "1080", "dtype": "float32"}, {"name": "1081", "dtype": "float32"}, {"name": "1082", "dtype": "float32"}, {"name": "1083", "dtype": "float32"}, {"name": "1084", "dtype": "float32"}, {"name": "1085", "dtype": "float32"}, {"name": "1086", "dtype": "float32"}, {"name": "1087", "dtype": "float32"}, {"name": "1088", "dtype": "float32"}, {"name": "1089", "dtype": "float32"}, {"name": "1090", "dtype": "float32"}, {"name": "1091", "dtype": "float32"}, {"name": "1092", "dtype": "float32"}, {"name": "1093", "dtype": "float32"}, {"name": "1094", "dtype": "float32"}, {"name": "1095", "dtype": "float32"}, {"name": "1096", "dtype": "float32"}, {"name": "1097", "dtype": "float32"}, {"name": "1098", "dtype": "float32"}, {"name": "1099", "dtype": "float32"}, {"name": "1100", "dtype": "float32"}, {"name": "1101", "dtype": "float32"}, {"name": "1102", "dtype": "float32"}, {"name": "1103", "dtype": "float32"}, {"name": "1104", "dtype": "float32"}, {"name": "1105", "dtype": "float32"}, {"name": "1106", "dtype": "float32"}, {"name": "1107", "dtype": "float32"}, {"name": "1108", "dtype": "float32"}, {"name": "1109", "dtype": "float32"}, {"name": "1110", "dtype": "float32"}, {"name": "1111", "dtype": "float32"}, {"name": "1112", "dtype": "float32"}, {"name": "1113", "dtype": "float32"}, {"name": "1114", "dtype": "float32"}, {"name": "1115", "dtype": "float32"}, {"name": "1116", "dtype": "float32"}, {"name": "1117", "dtype": "float32"}, {"name": "1118", "dtype": "float32"}, {"name": "1119", "dtype": "float32"}, {"name": "1120", "dtype": "float32"}, {"name": "1121", "dtype": "float32"}, {"name": "1122", "dtype": "float32"}, {"name": "1123", "dtype": "float32"}, {"name": "1124", "dtype": "float32"}, {"name": "1125", "dtype": "float32"}, {"name": "1126", "dtype": "float32"}, {"name": "1127", "dtype": "float32"}, {"name": "1128", "dtype": "float32"}, {"name": "1129", "dtype": "float32"}, {"name": "1130", "dtype": "float32"}, {"name": "1131", "dtype": "float32"}, {"name": "1132", "dtype": "float32"}, {"name": "1133", "dtype": "float32"}, {"name": "1134", "dtype": "float32"}, {"name": "1135", "dtype": "float32"}, {"name": "1136", "dtype": "float32"}, {"name": "1137", "dtype": "float32"}, {"name": "1138", "dtype": "float32"}, {"name": "1139", "dtype": "float32"}, {"name": "1140", "dtype": "float32"}, {"name": "1141", "dtype": "float32"}, {"name": "1142", "dtype": "float32"}, {"name": "1143", "dtype": "float32"}, {"name": "1144", "dtype": "float32"}, {"name": "1145", "dtype": "float32"}, {"name": 
"1146", "dtype": "float32"}, {"name": "1147", "dtype": "float32"}, {"name": "1148", "dtype": "float32"}, {"name": "1149", "dtype": "float32"}, {"name": "1150", "dtype": "float32"}, {"name": "1151", "dtype": "float32"}, {"name": "1152", "dtype": "float32"}, {"name": "1153", "dtype": "float32"}, {"name": "1154", "dtype": "float32"}, {"name": "1155", "dtype": "float32"}, {"name": "1156", "dtype": "float32"}, {"name": "1157", "dtype": "float32"}, {"name": "1158", "dtype": "float32"}, {"name": "1159", "dtype": "float32"}, {"name": "1160", "dtype": "float32"}, {"name": "1161", "dtype": "float32"}, {"name": "1162", "dtype": "float32"}, {"name": "1163", "dtype": "float32"}, {"name": "1164", "dtype": "float32"}, {"name": "1165", "dtype": "float32"}, {"name": "1166", "dtype": "float32"}, {"name": "1167", "dtype": "float32"}, {"name": "1168", "dtype": "float32"}, {"name": "1169", "dtype": "float32"}, {"name": "1170", "dtype": "float32"}, {"name": "1171", "dtype": "float32"}, {"name": "1172", "dtype": "float32"}, {"name": "1173", "dtype": "float32"}, {"name": "1174", "dtype": "float32"}, {"name": "1175", "dtype": "float32"}, {"name": "1176", "dtype": "float32"}, {"name": "1177", "dtype": "float32"}, {"name": "1178", "dtype": "float32"}, {"name": "1179", "dtype": "float32"}, {"name": "1180", "dtype": "float32"}, {"name": "1181", "dtype": "float32"}, {"name": "1182", "dtype": "float32"}, {"name": "1183", "dtype": "float32"}, {"name": "1184", "dtype": "float32"}, {"name": "1185", "dtype": "float32"}, {"name": "1186", "dtype": "float32"}, {"name": "1187", "dtype": "float32"}, {"name": "1188", "dtype": "float32"}, {"name": "1189", "dtype": "float32"}, {"name": "1190", "dtype": "float32"}, {"name": "1191", "dtype": "float32"}, {"name": "1192", "dtype": "float32"}, {"name": "1193", "dtype": "float32"}, {"name": "1194", "dtype": "float32"}, {"name": "1195", "dtype": "float32"}, {"name": "1196", "dtype": "float32"}, {"name": "1197", "dtype": "float32"}, {"name": "1198", "dtype": "float32"}, {"name": "1199", "dtype": "float32"}, {"name": "1200", "dtype": "float32"}, {"name": "1201", "dtype": "float32"}, {"name": "1202", "dtype": "float32"}, {"name": "1203", "dtype": "float32"}, {"name": "1204", "dtype": "float32"}, {"name": "1205", "dtype": "float32"}, {"name": "1206", "dtype": "float32"}, {"name": "1207", "dtype": "float32"}, {"name": "1208", "dtype": "float32"}, {"name": "1209", "dtype": "float32"}, {"name": "1210", "dtype": "float32"}, {"name": "1211", "dtype": "float32"}, {"name": "1212", "dtype": "float32"}, {"name": "1213", "dtype": "float32"}, {"name": "1214", "dtype": "float32"}, {"name": "1215", "dtype": "float32"}, {"name": "1216", "dtype": "float32"}, {"name": "1217", "dtype": "float32"}, {"name": "1218", "dtype": "float32"}, {"name": "1219", "dtype": "float32"}, {"name": "1220", "dtype": "float32"}, {"name": "1221", "dtype": "float32"}, {"name": "1222", "dtype": "float32"}, {"name": "1223", "dtype": "float32"}, {"name": "1224", "dtype": "float32"}, {"name": "1225", "dtype": "float32"}, {"name": "1226", "dtype": "float32"}, {"name": "1227", "dtype": "float32"}, {"name": "1228", "dtype": "float32"}, {"name": "1229", "dtype": "float32"}, {"name": "1230", "dtype": "float32"}, {"name": "1231", "dtype": "float32"}, {"name": "1232", "dtype": "float32"}, {"name": "1233", "dtype": "float32"}, {"name": "1234", "dtype": "float32"}, {"name": "1235", "dtype": "float32"}, {"name": "1236", "dtype": "float32"}, {"name": "1237", "dtype": "float32"}, {"name": "1238", "dtype": "float32"}, {"name": "1239", "dtype": 
"float32"}, {"name": "1240", "dtype": "float32"}, {"name": "1241", "dtype": "float32"}, {"name": "1242", "dtype": "float32"}, {"name": "1243", "dtype": "float32"}, {"name": "1244", "dtype": "float32"}, {"name": "1245", "dtype": "float32"}, {"name": "1246", "dtype": "float32"}, {"name": "1247", "dtype": "float32"}, {"name": "1248", "dtype": "float32"}, {"name": "1249", "dtype": "float32"}, {"name": "1250", "dtype": "float32"}, {"name": "1251", "dtype": "float32"}, {"name": "1252", "dtype": "float32"}, {"name": "1253", "dtype": "float32"}, {"name": "1254", "dtype": "float32"}, {"name": "1255", "dtype": "float32"}, {"name": "1256", "dtype": "float32"}, {"name": "1257", "dtype": "float32"}, {"name": "1258", "dtype": "float32"}, {"name": "1259", "dtype": "float32"}, {"name": "1260", "dtype": "float32"}, {"name": "1261", "dtype": "float32"}, {"name": "1262", "dtype": "float32"}, {"name": "1263", "dtype": "float32"}, {"name": "1264", "dtype": "float32"}, {"name": "1265", "dtype": "float32"}, {"name": "1266", "dtype": "float32"}, {"name": "1267", "dtype": "float32"}, {"name": "1268", "dtype": "float32"}, {"name": "1269", "dtype": "float32"}, {"name": "1270", "dtype": "float32"}, {"name": "1271", "dtype": "float32"}, {"name": "1272", "dtype": "float32"}, {"name": "1273", "dtype": "float32"}, {"name": "1274", "dtype": "float32"}, {"name": "1275", "dtype": "float32"}, {"name": "1276", "dtype": "float32"}, {"name": "1277", "dtype": "float32"}, {"name": "1278", "dtype": "float32"}, {"name": "1279", "dtype": "float32"}, {"name": "1280", "dtype": "float32"}, {"name": "1281", "dtype": "float32"}, {"name": "1282", "dtype": "float32"}, {"name": "1283", "dtype": "float32"}, {"name": "1284", "dtype": "float32"}, {"name": "1285", "dtype": "float32"}, {"name": "1286", "dtype": "float32"}, {"name": "1287", "dtype": "float32"}, {"name": "1288", "dtype": "float32"}, {"name": "1289", "dtype": "float32"}, {"name": "1290", "dtype": "float32"}, {"name": "1291", "dtype": "float32"}, {"name": "1292", "dtype": "float32"}, {"name": "1293", "dtype": "float32"}, {"name": "1294", "dtype": "float32"}, {"name": "1295", "dtype": "float32"}, {"name": "1296", "dtype": "float32"}, {"name": "1297", "dtype": "float32"}, {"name": "1298", "dtype": "float32"}, {"name": "1299", "dtype": "float32"}, {"name": "1300", "dtype": "float32"}, {"name": "1301", "dtype": "float32"}, {"name": "1302", "dtype": "float32"}, {"name": "1303", "dtype": "float32"}, {"name": "1304", "dtype": "float32"}, {"name": "1305", "dtype": "float32"}, {"name": "1306", "dtype": "float32"}, {"name": "1307", "dtype": "float32"}, {"name": "1308", "dtype": "float32"}, {"name": "1309", "dtype": "float32"}, {"name": "1310", "dtype": "float32"}, {"name": "1311", "dtype": "float32"}, {"name": "1312", "dtype": "float32"}, {"name": "1313", "dtype": "float32"}, {"name": "1314", "dtype": "float32"}, {"name": "1315", "dtype": "float32"}, {"name": "1316", "dtype": "float32"}, {"name": "1317", "dtype": "float32"}, {"name": "1318", "dtype": "float32"}, {"name": "1319", "dtype": "float32"}, {"name": "1320", "dtype": "float32"}, {"name": "1321", "dtype": "float32"}, {"name": "1322", "dtype": "float32"}, {"name": "1323", "dtype": "float32"}, {"name": "1324", "dtype": "float32"}, {"name": "1325", "dtype": "float32"}, {"name": "1326", "dtype": "float32"}, {"name": "1327", "dtype": "float32"}, {"name": "1328", "dtype": "float32"}, {"name": "1329", "dtype": "float32"}, {"name": "1330", "dtype": "float32"}, {"name": "1331", "dtype": "float32"}, {"name": "1332", "dtype": "float32"}, {"name": 
"1333", "dtype": "float32"}, {"name": "1334", "dtype": "float32"}, {"name": "1335", "dtype": "float32"}, {"name": "1336", "dtype": "float32"}, {"name": "1337", "dtype": "float32"}, {"name": "1338", "dtype": "float32"}, {"name": "1339", "dtype": "float32"}, {"name": "1340", "dtype": "float32"}, {"name": "1341", "dtype": "float32"}, {"name": "1342", "dtype": "float32"}, {"name": "1343", "dtype": "float32"}, {"name": "1344", "dtype": "float32"}, {"name": "1345", "dtype": "float32"}, {"name": "1346", "dtype": "float32"}, {"name": "1347", "dtype": "float32"}, {"name": "1348", "dtype": "float32"}, {"name": "1349", "dtype": "float32"}, {"name": "1350", "dtype": "float32"}, {"name": "1351", "dtype": "float32"}, {"name": "1352", "dtype": "float32"}, {"name": "1353", "dtype": "float32"}, {"name": "1354", "dtype": "float32"}, {"name": "1355", "dtype": "float32"}, {"name": "1356", "dtype": "float32"}, {"name": "1357", "dtype": "float32"}, {"name": "1358", "dtype": "float32"}, {"name": "1359", "dtype": "float32"}, {"name": "1360", "dtype": "float32"}, {"name": "1361", "dtype": "float32"}, {"name": "1362", "dtype": "float32"}, {"name": "1363", "dtype": "float32"}, {"name": "1364", "dtype": "float32"}, {"name": "1365", "dtype": "float32"}, {"name": "1366", "dtype": "float32"}, {"name": "1367", "dtype": "float32"}, {"name": "1368", "dtype": "float32"}, {"name": "1369", "dtype": "float32"}, {"name": "1370", "dtype": "float32"}, {"name": "1371", "dtype": "float32"}, {"name": "1372", "dtype": "float32"}, {"name": "1373", "dtype": "float32"}, {"name": "1374", "dtype": "float32"}, {"name": "1375", "dtype": "float32"}, {"name": "1376", "dtype": "float32"}, {"name": "1377", "dtype": "float32"}, {"name": "1378", "dtype": "float32"}, {"name": "1379", "dtype": "float32"}, {"name": "1380", "dtype": "float32"}, {"name": "1381", "dtype": "float32"}, {"name": "1382", "dtype": "float32"}, {"name": "1383", "dtype": "float32"}, {"name": "1384", "dtype": "float32"}, {"name": "1385", "dtype": "float32"}, {"name": "1386", "dtype": "float32"}, {"name": "1387", "dtype": "float32"}, {"name": "1388", "dtype": "float32"}, {"name": "1389", "dtype": "float32"}, {"name": "1390", "dtype": "float32"}, {"name": "1391", "dtype": "float32"}, {"name": "1392", "dtype": "float32"}, {"name": "1393", "dtype": "float32"}, {"name": "1394", "dtype": "float32"}, {"name": "1395", "dtype": "float32"}, {"name": "1396", "dtype": "float32"}, {"name": "1397", "dtype": "float32"}, {"name": "1398", "dtype": "float32"}, {"name": "1399", "dtype": "float32"}, {"name": "1400", "dtype": "float32"}, {"name": "1401", "dtype": "float32"}, {"name": "1402", "dtype": "float32"}, {"name": "1403", "dtype": "float32"}, {"name": "1404", "dtype": "float32"}, {"name": "1405", "dtype": "float32"}, {"name": "1406", "dtype": "float32"}, {"name": "1407", "dtype": "float32"}, {"name": "1408", "dtype": "float32"}, {"name": "1409", "dtype": "float32"}, {"name": "1410", "dtype": "float32"}, {"name": "1411", "dtype": "float32"}, {"name": "1412", "dtype": "float32"}, {"name": "1413", "dtype": "float32"}, {"name": "1414", "dtype": "float32"}, {"name": "1415", "dtype": "float32"}, {"name": "1416", "dtype": "float32"}, {"name": "1417", "dtype": "float32"}, {"name": "1418", "dtype": "float32"}, {"name": "1419", "dtype": "float32"}, {"name": "1420", "dtype": "float32"}, {"name": "1421", "dtype": "float32"}, {"name": "1422", "dtype": "float32"}, {"name": "1423", "dtype": "float32"}, {"name": "1424", "dtype": "float32"}, {"name": "1425", "dtype": "float32"}, {"name": "1426", "dtype": 
"float32"}, {"name": "1427", "dtype": "float32"}, {"name": "1428", "dtype": "float32"}, {"name": "1429", "dtype": "float32"}, {"name": "1430", "dtype": "float32"}, {"name": "1431", "dtype": "float32"}, {"name": "1432", "dtype": "float32"}, {"name": "1433", "dtype": "float32"}, {"name": "1434", "dtype": "float32"}, {"name": "1435", "dtype": "float32"}, {"name": "1436", "dtype": "float32"}, {"name": "1437", "dtype": "float32"}, {"name": "1438", "dtype": "float32"}, {"name": "1439", "dtype": "float32"}, {"name": "1440", "dtype": "float32"}, {"name": "1441", "dtype": "float32"}, {"name": "1442", "dtype": "float32"}, {"name": "1443", "dtype": "float32"}, {"name": "1444", "dtype": "float32"}, {"name": "1445", "dtype": "float32"}, {"name": "1446", "dtype": "float32"}, {"name": "1447", "dtype": "float32"}, {"name": "1448", "dtype": "float32"}, {"name": "1449", "dtype": "float32"}, {"name": "1450", "dtype": "float32"}, {"name": "1451", "dtype": "float32"}, {"name": "1452", "dtype": "float32"}, {"name": "1453", "dtype": "float32"}, {"name": "1454", "dtype": "float32"}, {"name": "1455", "dtype": "float32"}, {"name": "1456", "dtype": "float32"}, {"name": "1457", "dtype": "float32"}, {"name": "1458", "dtype": "float32"}, {"name": "1459", "dtype": "float32"}, {"name": "1460", "dtype": "float32"}, {"name": "1461", "dtype": "float32"}, {"name": "1462", "dtype": "float32"}, {"name": "1463", "dtype": "float32"}, {"name": "1464", "dtype": "float32"}, {"name": "1465", "dtype": "float32"}, {"name": "1466", "dtype": "float32"}, {"name": "1467", "dtype": "float32"}, {"name": "1468", "dtype": "float32"}, {"name": "1469", "dtype": "float32"}, {"name": "1470", "dtype": "float32"}, {"name": "1471", "dtype": "float32"}, {"name": "1472", "dtype": "float32"}, {"name": "1473", "dtype": "float32"}, {"name": "1474", "dtype": "float32"}, {"name": "1475", "dtype": "float32"}, {"name": "1476", "dtype": "float32"}, {"name": "1477", "dtype": "float32"}, {"name": "1478", "dtype": "float32"}, {"name": "1479", "dtype": "float32"}, {"name": "1480", "dtype": "float32"}, {"name": "1481", "dtype": "float32"}, {"name": "1482", "dtype": "float32"}, {"name": "1483", "dtype": "float32"}, {"name": "1484", "dtype": "float32"}, {"name": "1485", "dtype": "float32"}, {"name": "1486", "dtype": "float32"}, {"name": "1487", "dtype": "float32"}, {"name": "1488", "dtype": "float32"}, {"name": "1489", "dtype": "float32"}, {"name": "1490", "dtype": "float32"}, {"name": "1491", "dtype": "float32"}, {"name": "1492", "dtype": "float32"}, {"name": "1493", "dtype": "float32"}, {"name": "1494", "dtype": "float32"}, {"name": "1495", "dtype": "float32"}, {"name": "1496", "dtype": "float32"}, {"name": "1497", "dtype": "float32"}, {"name": "1498", "dtype": "float32"}, {"name": "1499", "dtype": "float32"}, {"name": "1500", "dtype": "float32"}, {"name": "1501", "dtype": "float32"}, {"name": "1502", "dtype": "float32"}, {"name": "1503", "dtype": "float32"}, {"name": "1504", "dtype": "float32"}, {"name": "1505", "dtype": "float32"}, {"name": "1506", "dtype": "float32"}, {"name": "1507", "dtype": "float32"}, {"name": "1508", "dtype": "float32"}, {"name": "1509", "dtype": "float32"}, {"name": "1510", "dtype": "float32"}, {"name": "1511", "dtype": "float32"}, {"name": "1512", "dtype": "float32"}, {"name": "1513", "dtype": "float32"}, {"name": "1514", "dtype": "float32"}, {"name": "1515", "dtype": "float32"}, {"name": "1516", "dtype": "float32"}, {"name": "1517", "dtype": "float32"}, {"name": "1518", "dtype": "float32"}, {"name": "1519", "dtype": "float32"}, {"name": 
"1520", "dtype": "float32"}, {"name": "1521", "dtype": "float32"}, {"name": "1522", "dtype": "float32"}, {"name": "1523", "dtype": "float32"}, {"name": "1524", "dtype": "float32"}, {"name": "1525", "dtype": "float32"}, {"name": "1526", "dtype": "float32"}, {"name": "1527", "dtype": "float32"}, {"name": "1528", "dtype": "float32"}, {"name": "1529", "dtype": "float32"}, {"name": "1530", "dtype": "float32"}, {"name": "1531", "dtype": "float32"}, {"name": "1532", "dtype": "float32"}, {"name": "1533", "dtype": "float32"}, {"name": "1534", "dtype": "float32"}, {"name": "1535", "dtype": "float32"}, {"name": "1536", "dtype": "float32"}, {"name": "1537", "dtype": "float32"}, {"name": "1538", "dtype": "float32"}, {"name": "1539", "dtype": "float32"}, {"name": "1540", "dtype": "float32"}, {"name": "1541", "dtype": "float32"}, {"name": "1542", "dtype": "float32"}, {"name": "1543", "dtype": "float32"}, {"name": "1544", "dtype": "float32"}, {"name": "1545", "dtype": "float32"}, {"name": "1546", "dtype": "float32"}, {"name": "1547", "dtype": "float32"}, {"name": "1548", "dtype": "float32"}, {"name": "1549", "dtype": "float32"}, {"name": "1550", "dtype": "float32"}, {"name": "1551", "dtype": "float32"}, {"name": "1552", "dtype": "float32"}, {"name": "1553", "dtype": "float32"}, {"name": "1554", "dtype": "float32"}, {"name": "1555", "dtype": "float32"}, {"name": "1556", "dtype": "float32"}, {"name": "1557", "dtype": "float32"}, {"name": "1558", "dtype": "float32"}, {"name": "1559", "dtype": "float32"}, {"name": "1560", "dtype": "float32"}, {"name": "1561", "dtype": "float32"}, {"name": "1562", "dtype": "float32"}, {"name": "1563", "dtype": "float32"}, {"name": "1564", "dtype": "float32"}, {"name": "1565", "dtype": "float32"}, {"name": "1566", "dtype": "float32"}, {"name": "1567", "dtype": "float32"}, {"name": "1568", "dtype": "float32"}, {"name": "1569", "dtype": "float32"}, {"name": "1570", "dtype": "float32"}, {"name": "1571", "dtype": "float32"}, {"name": "1572", "dtype": "float32"}, {"name": "1573", "dtype": "float32"}, {"name": "1574", "dtype": "float32"}, {"name": "1575", "dtype": "float32"}, {"name": "1576", "dtype": "float32"}, {"name": "1577", "dtype": "float32"}, {"name": "1578", "dtype": "float32"}, {"name": "1579", "dtype": "float32"}, {"name": "1580", "dtype": "float32"}, {"name": "1581", "dtype": "float32"}, {"name": "1582", "dtype": "float32"}, {"name": "1583", "dtype": "float32"}, {"name": "1584", "dtype": "float32"}, {"name": "1585", "dtype": "float32"}, {"name": "1586", "dtype": "float32"}, {"name": "1587", "dtype": "float32"}, {"name": "1588", "dtype": "float32"}, {"name": "1589", "dtype": "float32"}, {"name": "1590", "dtype": "float32"}, {"name": "1591", "dtype": "float32"}, {"name": "1592", "dtype": "float32"}, {"name": "1593", "dtype": "float32"}, {"name": "1594", "dtype": "float32"}, {"name": "1595", "dtype": "float32"}, {"name": "1596", "dtype": "float32"}, {"name": "1597", "dtype": "float32"}, {"name": "1598", "dtype": "float32"}, {"name": "1599", "dtype": "float32"}, {"name": "1600", "dtype": "float32"}, {"name": "1601", "dtype": "float32"}, {"name": "1602", "dtype": "float32"}, {"name": "1603", "dtype": "float32"}, {"name": "1604", "dtype": "float32"}, {"name": "1605", "dtype": "float32"}, {"name": "1606", "dtype": "float32"}, {"name": "1607", "dtype": "float32"}, {"name": "1608", "dtype": "float32"}, {"name": "1609", "dtype": "float32"}, {"name": "1610", "dtype": "float32"}, {"name": "1611", "dtype": "float32"}, {"name": "1612", "dtype": "float32"}, {"name": "1613", "dtype": 
"float32"}, {"name": "1614", "dtype": "float32"}, {"name": "1615", "dtype": "float32"}, {"name": "1616", "dtype": "float32"}, {"name": "1617", "dtype": "float32"}, {"name": "1618", "dtype": "float32"}, {"name": "1619", "dtype": "float32"}, {"name": "1620", "dtype": "float32"}, {"name": "1621", "dtype": "float32"}, {"name": "1622", "dtype": "float32"}, {"name": "1623", "dtype": "float32"}, {"name": "1624", "dtype": "float32"}, {"name": "1625", "dtype": "float32"}, {"name": "1626", "dtype": "float32"}, {"name": "1627", "dtype": "float32"}, {"name": "1628", "dtype": "float32"}, {"name": "1629", "dtype": "float32"}, {"name": "1630", "dtype": "float32"}, {"name": "1631", "dtype": "float32"}, {"name": "1632", "dtype": "float32"}, {"name": "1633", "dtype": "float32"}, {"name": "1634", "dtype": "float32"}, {"name": "1635", "dtype": "float32"}, {"name": "1636", "dtype": "float32"}, {"name": "1637", "dtype": "float32"}, {"name": "1638", "dtype": "float32"}, {"name": "1639", "dtype": "float32"}, {"name": "1640", "dtype": "float32"}, {"name": "1641", "dtype": "float32"}, {"name": "1642", "dtype": "float32"}, {"name": "1643", "dtype": "float32"}, {"name": "1644", "dtype": "float32"}, {"name": "1645", "dtype": "float32"}, {"name": "1646", "dtype": "float32"}, {"name": "1647", "dtype": "float32"}, {"name": "1648", "dtype": "float32"}, {"name": "1649", "dtype": "float32"}, {"name": "1650", "dtype": "float32"}, {"name": "1651", "dtype": "float32"}, {"name": "1652", "dtype": "float32"}, {"name": "1653", "dtype": "float32"}, {"name": "1654", "dtype": "float32"}, {"name": "1655", "dtype": "float32"}, {"name": "1656", "dtype": "float32"}, {"name": "1657", "dtype": "float32"}, {"name": "1658", "dtype": "float32"}, {"name": "1659", "dtype": "float32"}, {"name": "1660", "dtype": "float32"}, {"name": "1661", "dtype": "float32"}, {"name": "1662", "dtype": "float32"}, {"name": "1663", "dtype": "float32"}, {"name": "1664", "dtype": "float32"}, {"name": "1665", "dtype": "float32"}, {"name": "1666", "dtype": "float32"}, {"name": "1667", "dtype": "float32"}, {"name": "1668", "dtype": "float32"}, {"name": "1669", "dtype": "float32"}, {"name": "1670", "dtype": "float32"}, {"name": "1671", "dtype": "float32"}, {"name": "1672", "dtype": "float32"}, {"name": "1673", "dtype": "float32"}, {"name": "1674", "dtype": "float32"}, {"name": "1675", "dtype": "float32"}, {"name": "1676", "dtype": "float32"}, {"name": "1677", "dtype": "float32"}, {"name": "1678", "dtype": "float32"}, {"name": "1679", "dtype": "float32"}, {"name": "1680", "dtype": "float32"}, {"name": "1681", "dtype": "float32"}, {"name": "1682", "dtype": "float32"}, {"name": "1683", "dtype": "float32"}, {"name": "1684", "dtype": "float32"}, {"name": "1685", "dtype": "float32"}, {"name": "1686", "dtype": "float32"}, {"name": "1687", "dtype": "float32"}, {"name": "1688", "dtype": "float32"}, {"name": "1689", "dtype": "float32"}, {"name": "1690", "dtype": "float32"}, {"name": "1691", "dtype": "float32"}, {"name": "1692", "dtype": "float32"}, {"name": "1693", "dtype": "float32"}, {"name": "1694", "dtype": "float32"}, {"name": "1695", "dtype": "float32"}, {"name": "1696", "dtype": "float32"}, {"name": "1697", "dtype": "float32"}, {"name": "1698", "dtype": "float32"}, {"name": "1699", "dtype": "float32"}, {"name": "1700", "dtype": "float32"}, {"name": "1701", "dtype": "float32"}, {"name": "1702", "dtype": "float32"}, {"name": "1703", "dtype": "float32"}, {"name": "1704", "dtype": "float32"}, {"name": "1705", "dtype": "float32"}, {"name": "1706", "dtype": "float32"}, {"name": 
"1707", "dtype": "float32"}, {"name": "1708", "dtype": "float32"}, {"name": "1709", "dtype": "float32"}, {"name": "1710", "dtype": "float32"}, {"name": "1711", "dtype": "float32"}, {"name": "1712", "dtype": "float32"}, {"name": "1713", "dtype": "float32"}, {"name": "1714", "dtype": "float32"}, {"name": "1715", "dtype": "float32"}, {"name": "1716", "dtype": "float32"}, {"name": "1717", "dtype": "float32"}, {"name": "1718", "dtype": "float32"}, {"name": "1719", "dtype": "float32"}, {"name": "1720", "dtype": "float32"}, {"name": "1721", "dtype": "float32"}, {"name": "1722", "dtype": "float32"}, {"name": "1723", "dtype": "float32"}, {"name": "1724", "dtype": "float32"}, {"name": "1725", "dtype": "float32"}, {"name": "1726", "dtype": "float32"}, {"name": "1727", "dtype": "float32"}, {"name": "1728", "dtype": "float32"}, {"name": "1729", "dtype": "float32"}, {"name": "1730", "dtype": "float32"}, {"name": "1731", "dtype": "float32"}, {"name": "1732", "dtype": "float32"}, {"name": "1733", "dtype": "float32"}, {"name": "1734", "dtype": "float32"}, {"name": "1735", "dtype": "float32"}, {"name": "1736", "dtype": "float32"}, {"name": "1737", "dtype": "float32"}, {"name": "1738", "dtype": "float32"}, {"name": "1739", "dtype": "float32"}, {"name": "1740", "dtype": "float32"}, {"name": "1741", "dtype": "float32"}, {"name": "1742", "dtype": "float32"}, {"name": "1743", "dtype": "float32"}, {"name": "1744", "dtype": "float32"}, {"name": "1745", "dtype": "float32"}, {"name": "1746", "dtype": "float32"}, {"name": "1747", "dtype": "float32"}, {"name": "1748", "dtype": "float32"}, {"name": "1749", "dtype": "float32"}, {"name": "1750", "dtype": "float32"}, {"name": "1751", "dtype": "float32"}, {"name": "1752", "dtype": "float32"}, {"name": "1753", "dtype": "float32"}, {"name": "1754", "dtype": "float32"}, {"name": "1755", "dtype": "float32"}, {"name": "1756", "dtype": "float32"}, {"name": "1757", "dtype": "float32"}, {"name": "1758", "dtype": "float32"}, {"name": "1759", "dtype": "float32"}, {"name": "1760", "dtype": "float32"}, {"name": "1761", "dtype": "float32"}, {"name": "1762", "dtype": "float32"}, {"name": "1763", "dtype": "float32"}, {"name": "1764", "dtype": "float32"}, {"name": "1765", "dtype": "float32"}, {"name": "1766", "dtype": "float32"}, {"name": "1767", "dtype": "float32"}, {"name": "1768", "dtype": "float32"}, {"name": "1769", "dtype": "float32"}, {"name": "1770", "dtype": "float32"}, {"name": "1771", "dtype": "float32"}, {"name": "1772", "dtype": "float32"}, {"name": "1773", "dtype": "float32"}, {"name": "1774", "dtype": "float32"}, {"name": "1775", "dtype": "float32"}, {"name": "1776", "dtype": "float32"}, {"name": "1777", "dtype": "float32"}, {"name": "1778", "dtype": "float32"}, {"name": "1779", "dtype": "float32"}, {"name": "1780", "dtype": "float32"}, {"name": "1781", "dtype": "float32"}, {"name": "1782", "dtype": "float32"}, {"name": "1783", "dtype": "float32"}, {"name": "1784", "dtype": "float32"}, {"name": "1785", "dtype": "float32"}, {"name": "1786", "dtype": "float32"}, {"name": "1787", "dtype": "float32"}, {"name": "1788", "dtype": "float32"}, {"name": "1789", "dtype": "float32"}, {"name": "1790", "dtype": "float32"}, {"name": "1791", "dtype": "float32"}, {"name": "1792", "dtype": "float32"}, {"name": "1793", "dtype": "float32"}, {"name": "1794", "dtype": "float32"}, {"name": "1795", "dtype": "float32"}, {"name": "1796", "dtype": "float32"}, {"name": "1797", "dtype": "float32"}, {"name": "1798", "dtype": "float32"}, {"name": "1799", "dtype": "float32"}, {"name": "1800", "dtype": 
"float32"}, {"name": "1801", "dtype": "float32"}, {"name": "1802", "dtype": "float32"}, {"name": "1803", "dtype": "float32"}, {"name": "1804", "dtype": "float32"}, {"name": "1805", "dtype": "float32"}, {"name": "1806", "dtype": "float32"}, {"name": "1807", "dtype": "float32"}, {"name": "1808", "dtype": "float32"}, {"name": "1809", "dtype": "float32"}, {"name": "1810", "dtype": "float32"}, {"name": "1811", "dtype": "float32"}, {"name": "1812", "dtype": "float32"}, {"name": "1813", "dtype": "float32"}, {"name": "1814", "dtype": "float32"}, {"name": "1815", "dtype": "float32"}, {"name": "1816", "dtype": "float32"}, {"name": "1817", "dtype": "float32"}, {"name": "1818", "dtype": "float32"}, {"name": "1819", "dtype": "float32"}, {"name": "1820", "dtype": "float32"}, {"name": "1821", "dtype": "float32"}, {"name": "1822", "dtype": "float32"}, {"name": "1823", "dtype": "float32"}, {"name": "1824", "dtype": "float32"}, {"name": "1825", "dtype": "float32"}, {"name": "1826", "dtype": "float32"}, {"name": "1827", "dtype": "float32"}, {"name": "1828", "dtype": "float32"}, {"name": "1829", "dtype": "float32"}, {"name": "1830", "dtype": "float32"}, {"name": "1831", "dtype": "float32"}, {"name": "1832", "dtype": "float32"}, {"name": "1833", "dtype": "float32"}, {"name": "1834", "dtype": "float32"}, {"name": "1835", "dtype": "float32"}, {"name": "1836", "dtype": "float32"}, {"name": "1837", "dtype": "float32"}, {"name": "1838", "dtype": "float32"}, {"name": "1839", "dtype": "float32"}, {"name": "1840", "dtype": "float32"}, {"name": "1841", "dtype": "float32"}, {"name": "1842", "dtype": "float32"}, {"name": "1843", "dtype": "float32"}, {"name": "1844", "dtype": "float32"}, {"name": "1845", "dtype": "float32"}, {"name": "1846", "dtype": "float32"}, {"name": "1847", "dtype": "float32"}, {"name": "1848", "dtype": "float32"}, {"name": "1849", "dtype": "float32"}, {"name": "1850", "dtype": "float32"}, {"name": "1851", "dtype": "float32"}, {"name": "1852", "dtype": "float32"}, {"name": "1853", "dtype": "float32"}, {"name": "1854", "dtype": "float32"}, {"name": "1855", "dtype": "float32"}, {"name": "1856", "dtype": "float32"}, {"name": "1857", "dtype": "float32"}, {"name": "1858", "dtype": "float32"}, {"name": "1859", "dtype": "float32"}, {"name": "1860", "dtype": "float32"}, {"name": "1861", "dtype": "float32"}, {"name": "1862", "dtype": "float32"}, {"name": "1863", "dtype": "float32"}, {"name": "1864", "dtype": "float32"}, {"name": "1865", "dtype": "float32"}, {"name": "1866", "dtype": "float32"}, {"name": "1867", "dtype": "float32"}, {"name": "1868", "dtype": "float32"}, {"name": "1869", "dtype": "float32"}, {"name": "1870", "dtype": "float32"}, {"name": "1871", "dtype": "float32"}, {"name": "1872", "dtype": "float32"}, {"name": "1873", "dtype": "float32"}, {"name": "1874", "dtype": "float32"}, {"name": "1875", "dtype": "float32"}, {"name": "1876", "dtype": "float32"}, {"name": "1877", "dtype": "float32"}, {"name": "1878", "dtype": "float32"}, {"name": "1879", "dtype": "float32"}, {"name": "1880", "dtype": "float32"}, {"name": "1881", "dtype": "float32"}, {"name": "1882", "dtype": "float32"}, {"name": "1883", "dtype": "float32"}, {"name": "1884", "dtype": "float32"}, {"name": "1885", "dtype": "float32"}, {"name": "1886", "dtype": "float32"}, {"name": "1887", "dtype": "float32"}, {"name": "1888", "dtype": "float32"}, {"name": "1889", "dtype": "float32"}, {"name": "1890", "dtype": "float32"}, {"name": "1891", "dtype": "float32"}, {"name": "1892", "dtype": "float32"}, {"name": "1893", "dtype": "float32"}, {"name": 
"1894", "dtype": "float32"}, {"name": "1895", "dtype": "float32"}, {"name": "1896", "dtype": "float32"}, {"name": "1897", "dtype": "float32"}, {"name": "1898", "dtype": "float32"}, {"name": "1899", "dtype": "float32"}, {"name": "1900", "dtype": "float32"}, {"name": "1901", "dtype": "float32"}, {"name": "1902", "dtype": "float32"}, {"name": "1903", "dtype": "float32"}, {"name": "1904", "dtype": "float32"}, {"name": "1905", "dtype": "float32"}, {"name": "1906", "dtype": "float32"}, {"name": "1907", "dtype": "float32"}, {"name": "1908", "dtype": "float32"}, {"name": "1909", "dtype": "float32"}, {"name": "1910", "dtype": "float32"}, {"name": "1911", "dtype": "float32"}, {"name": "1912", "dtype": "float32"}, {"name": "1913", "dtype": "float32"}, {"name": "1914", "dtype": "float32"}, {"name": "1915", "dtype": "float32"}, {"name": "1916", "dtype": "float32"}, {"name": "1917", "dtype": "float32"}, {"name": "1918", "dtype": "float32"}, {"name": "1919", "dtype": "float32"}, {"name": "1920", "dtype": "float32"}, {"name": "1921", "dtype": "float32"}, {"name": "1922", "dtype": "float32"}, {"name": "1923", "dtype": "float32"}, {"name": "1924", "dtype": "float32"}, {"name": "1925", "dtype": "float32"}, {"name": "1926", "dtype": "float32"}, {"name": "1927", "dtype": "float32"}, {"name": "1928", "dtype": "float32"}, {"name": "1929", "dtype": "float32"}, {"name": "1930", "dtype": "float32"}, {"name": "1931", "dtype": "float32"}, {"name": "1932", "dtype": "float32"}, {"name": "1933", "dtype": "float32"}, {"name": "1934", "dtype": "float32"}, {"name": "1935", "dtype": "float32"}, {"name": "1936", "dtype": "float32"}, {"name": "1937", "dtype": "float32"}, {"name": "1938", "dtype": "float32"}, {"name": "1939", "dtype": "float32"}, {"name": "1940", "dtype": "float32"}, {"name": "1941", "dtype": "float32"}, {"name": "1942", "dtype": "float32"}, {"name": "1943", "dtype": "float32"}, {"name": "1944", "dtype": "float32"}, {"name": "1945", "dtype": "float32"}, {"name": "1946", "dtype": "float32"}, {"name": "1947", "dtype": "float32"}, {"name": "1948", "dtype": "float32"}, {"name": "1949", "dtype": "float32"}, {"name": "1950", "dtype": "float32"}, {"name": "1951", "dtype": "float32"}, {"name": "1952", "dtype": "float32"}, {"name": "1953", "dtype": "float32"}, {"name": "1954", "dtype": "float32"}, {"name": "1955", "dtype": "float32"}, {"name": "1956", "dtype": "float32"}, {"name": "1957", "dtype": "float32"}, {"name": "1958", "dtype": "float32"}, {"name": "1959", "dtype": "float32"}, {"name": "1960", "dtype": "float32"}, {"name": "1961", "dtype": "float32"}, {"name": "1962", "dtype": "float32"}, {"name": "1963", "dtype": "float32"}, {"name": "1964", "dtype": "float32"}, {"name": "1965", "dtype": "float32"}, {"name": "1966", "dtype": "float32"}, {"name": "1967", "dtype": "float32"}, {"name": "1968", "dtype": "float32"}, {"name": "1969", "dtype": "float32"}, {"name": "1970", "dtype": "float32"}, {"name": "1971", "dtype": "float32"}, {"name": "1972", "dtype": "float32"}, {"name": "1973", "dtype": "float32"}, {"name": "1974", "dtype": "float32"}, {"name": "1975", "dtype": "float32"}, {"name": "1976", "dtype": "float32"}, {"name": "1977", "dtype": "float32"}, {"name": "1978", "dtype": "float32"}, {"name": "1979", "dtype": "float32"}, {"name": "1980", "dtype": "float32"}, {"name": "1981", "dtype": "float32"}, {"name": "1982", "dtype": "float32"}, {"name": "1983", "dtype": "float32"}, {"name": "1984", "dtype": "float32"}, {"name": "1985", "dtype": "float32"}, {"name": "1986", "dtype": "float32"}, {"name": "1987", "dtype": 
"float32"}, {"name": "1988", "dtype": "float32"}, {"name": "1989", "dtype": "float32"}, {"name": "1990", "dtype": "float32"}, {"name": "1991", "dtype": "float32"}, {"name": "1992", "dtype": "float32"}, {"name": "1993", "dtype": "float32"}, {"name": "1994", "dtype": "float32"}, {"name": "1995", "dtype": "float32"}, {"name": "1996", "dtype": "float32"}, {"name": "1997", "dtype": "float32"}, {"name": "1998", "dtype": "float32"}, {"name": "1999", "dtype": "float32"}, {"name": "2000", "dtype": "float32"}, {"name": "2001", "dtype": "float32"}, {"name": "2002", "dtype": "float32"}, {"name": "2003", "dtype": "float32"}, {"name": "2004", "dtype": "float32"}, {"name": "2005", "dtype": "float32"}, {"name": "2006", "dtype": "float32"}, {"name": "2007", "dtype": "float32"}, {"name": "2008", "dtype": "float32"}, {"name": "2009", "dtype": "float32"}, {"name": "2010", "dtype": "float32"}, {"name": "2011", "dtype": "float32"}, {"name": "2012", "dtype": "float32"}, {"name": "2013", "dtype": "float32"}, {"name": "2014", "dtype": "float32"}, {"name": "2015", "dtype": "float32"}, {"name": "2016", "dtype": "float32"}, {"name": "2017", "dtype": "float32"}, {"name": "2018", "dtype": "float32"}, {"name": "2019", "dtype": "float32"}, {"name": "2020", "dtype": "float32"}, {"name": "2021", "dtype": "float32"}, {"name": "2022", "dtype": "float32"}, {"name": "2023", "dtype": "float32"}, {"name": "2024", "dtype": "float32"}, {"name": "2025", "dtype": "float32"}, {"name": "2026", "dtype": "float32"}, {"name": "2027", "dtype": "float32"}, {"name": "2028", "dtype": "float32"}, {"name": "2029", "dtype": "float32"}, {"name": "2030", "dtype": "float32"}, {"name": "2031", "dtype": "float32"}, {"name": "2032", "dtype": "float32"}, {"name": "2033", "dtype": "float32"}, {"name": "2034", "dtype": "float32"}, {"name": "2035", "dtype": "float32"}, {"name": "2036", "dtype": "float32"}, {"name": "2037", "dtype": "float32"}, {"name": "2038", "dtype": "float32"}, {"name": "2039", "dtype": "float32"}, {"name": "2040", "dtype": "float32"}, {"name": "2041", "dtype": "float32"}, {"name": "2042", "dtype": "float32"}, {"name": "2043", "dtype": "float32"}, {"name": "2044", "dtype": "float32"}, {"name": "2045", "dtype": "float32"}, {"name": "2046", "dtype": "float32"}, {"name": "2047", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 213730605, "num_examples": 26057}, {"name": "test", "num_bytes": 71246376, "num_examples": 8686}], "download_size": 392417877, "dataset_size": 284976981}}
2023-09-04T07:04:31+00:00
[]
[]
[ "# Dataset Card for \"AA_GPTNEO_FT\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"AA_GPTNEO_FT\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"AA_GPTNEO_FT\"\n\nMore Information needed" ]
554c5e613224014c2736c52e5c5bf38e343c4e73
# Dataset Card for "orca_minis_uncensored_squad_format" This dataset is a part of a continued series providing interestingly formatted existing data from unrelated datasets for question/answering model use. Alternately it can provide a common format that could be converted to something else easily using available scripts and utilities fairly easily. ### This is a work in progress and is changing every few days currently. Please refrain from using it for anything, especially seriously, unless a warning or example of atrocities are needed. [Original Dataset](https://huggingface.co/datasets/psmathur/orca_minis_uncensored_dataset)
jtatman/orca_minis_uncensored_squad_format
[ "task_categories:question-answering", "size_categories:10K<n<100K", "language:en", "license:mit", "squad", "orca", "subset", "refactor", "uncensored", "qa", "questions", "region:us" ]
2023-09-04T07:05:32+00:00
{"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"], "pretty_name": "uncensored_orca_subset_squad", "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 145821574, "num_examples": 104179}], "download_size": 60327229, "dataset_size": 145821574}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["squad", "orca", "subset", "refactor", "uncensored", "qa", "questions"]}
2023-09-06T07:01:34+00:00
[]
[ "en" ]
86289dbf98e2e1df12f66556af2981bc4ec6847f
# Dataset Card for "CSIC_BERT_FT" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EgilKarlsen/CSIC_BERT_FT
[ "region:us" ]
2023-09-04T07:09:27+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115621182, "num_examples": 37500}, {"name": "test", "num_bytes": 38540387, "num_examples": 12500}], "download_size": 211874717, "dataset_size": 154161569}}
2023-09-04T07:10:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for "CSIC_BERT_FT" More Information needed
[ "# Dataset Card for \"CSIC_BERT_FT\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"CSIC_BERT_FT\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"CSIC_BERT_FT\"\n\nMore Information needed" ]