sha stringlengths 40 40 | text stringlengths 1 13.4M | id stringlengths 2 117 | tags listlengths 1 7.91k | created_at stringlengths 25 25 | metadata stringlengths 2 875k | last_modified stringlengths 25 25 | arxiv listlengths 0 25 | languages listlengths 0 7.91k | tags_str stringlengths 17 159k | text_str stringlengths 1 447k | text_lists listlengths 0 352 | processed_texts listlengths 1 353 | tokens_length listlengths 1 353 | input_texts listlengths 1 40 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
867e81bdcb47b5d9675f497842d8330d914e20e7 | # Dataset Card for "criminal-sketch-Hr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | SujinHwang/criminal-sketch-Hr | [
"region:us"
]
| 2023-11-19T14:04:36+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 182934925.572, "num_examples": 8071}], "download_size": 166876827, "dataset_size": 182934925.572}} | 2023-11-25T06:19:44+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "criminal-sketch-Hr"
More Information needed | [
"# Dataset Card for \"criminal-sketch-Hr\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"criminal-sketch-Hr\"\n\nMore Information needed"
]
| [
6,
19
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"criminal-sketch-Hr\"\n\nMore Information needed"
]
|
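Going by the metadata in this row (an `image` feature, a `text` feature, and a single `train` split with 8,071 examples), a minimal loading sketch could look like the following; this is only an assumption based on the declared `dataset_info`, not verified against the repository.

```python
from datasets import load_dataset

# Sketch based on the dataset_info above: one "train" split with an "image"
# and a "text" feature (8,071 examples reported in the metadata).
ds = load_dataset("SujinHwang/criminal-sketch-Hr", split="train")

print(ds.features)    # expected: an Image feature plus a string "text" feature
print(ds[0]["text"])  # text paired with the first image (assumed layout)
```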
5ff8191938913ad71bb44135928cad6aec0f0ada |
# Dataset Card for Evaluation run of openaccess-ai-collective/grendel
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openaccess-ai-collective/grendel
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/grendel](https://huggingface.co/openaccess-ai-collective/grendel) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__grendel_public",
"harness_winogrande_5",
split="train")
```
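To get the aggregated numbers instead of per-task details, you can load the "results" configuration mentioned above. The sketch below is only an illustration: the configuration name and the "latest" split are taken from this card's metadata, and the printed fields depend on the actual file layout.

```python
from datasets import load_dataset

# Aggregated results for this run; "results" and the "latest" split are the
# names listed in this card's metadata (one timestamped split per run plus
# "latest" pointing to the most recent one).
results = load_dataset(
    "open-llm-leaderboard/details_openaccess-ai-collective__grendel_public",
    "results",
    split="latest",
)

print(results.column_names)  # inspect which aggregated fields are available
print(results[0])            # the most recent run's aggregated metrics
```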
## Latest results
These are the [latest results from run 2023-11-19T14:02:28.206445](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__grendel_public/blob/main/results_2023-11-19T14-02-28.206445.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5874222006151991,
"acc_stderr": 0.03333785718767842,
"acc_norm": 0.5935899001354114,
"acc_norm_stderr": 0.034032890288055895,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5267824071398005,
"mc2_stderr": 0.015695608410958812,
"em": 0.611996644295302,
"em_stderr": 0.0049903604159338,
"f1": 0.6469746224832212,
"f1_stderr": 0.0047180171110879675
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.01450676952480424,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938163
},
"harness|hellaswag|10": {
"acc": 0.6158135829516033,
"acc_stderr": 0.004854082479916909,
"acc_norm": 0.7999402509460267,
"acc_norm_stderr": 0.003992272261659567
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.02964781353936525,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.02964781353936525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.037242495958177295,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.037242495958177295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.0252544854247996,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.0252544854247996
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.01760430414925648,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.01760430414925648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693254,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693254
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.042450224863844935,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.042450224863844935
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.024161618127987745,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.024161618127987745
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934486,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917212,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917212
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21899441340782122,
"acc_stderr": 0.01383167668730318,
"acc_norm": 0.21899441340782122,
"acc_norm_stderr": 0.01383167668730318
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02699254433929724,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02699254433929724
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.026981478043648043,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.026981478043648043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719964,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719964
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.01260496081608737,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.01260496081608737
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.02997280717046462,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.02997280717046462
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.01982184368827175,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.01982184368827175
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5267824071398005,
"mc2_stderr": 0.015695608410958812
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.01212140294285556
},
"harness|drop|3": {
"em": 0.611996644295302,
"em_stderr": 0.0049903604159338,
"f1": 0.6469746224832212,
"f1_stderr": 0.0047180171110879675
},
"harness|gsm8k|5": {
"acc": 0.287338893100834,
"acc_stderr": 0.012464677060107086
}
}
```
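As a rough illustration of how these per-task scores can be post-processed, the sketch below downloads the results file linked above and computes an unweighted mean accuracy over the `harness|hendrycksTest-*` subtasks. The exact top-level layout of the JSON file is an assumption, so the snippet falls back to the dictionary shape shown above if there is no `"results"` key.

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_openaccess-ai-collective__grendel_public",
    filename="results_2023-11-19T14-02-28.206445.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The per-task scores may sit under a "results" key or at the top level,
# depending on the file layout; handle both.
scores = data.get("results", data)

# Unweighted mean accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = {k: v for k, v in scores.items() if k.startswith("harness|hendrycksTest-")}
mean_acc = sum(task["acc"] for task in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```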
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_openaccess-ai-collective__grendel | [
"region:us"
]
| 2023-11-19T14:05:26+00:00 | {"pretty_name": "Evaluation run of openaccess-ai-collective/grendel", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/grendel](https://huggingface.co/openaccess-ai-collective/grendel) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__grendel_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T14:02:28.206445](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__grendel_public/blob/main/results_2023-11-19T14-02-28.206445.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5874222006151991,\n \"acc_stderr\": 0.03333785718767842,\n \"acc_norm\": 0.5935899001354114,\n \"acc_norm_stderr\": 0.034032890288055895,\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5267824071398005,\n \"mc2_stderr\": 0.015695608410958812,\n \"em\": 0.611996644295302,\n \"em_stderr\": 0.0049903604159338,\n \"f1\": 0.6469746224832212,\n \"f1_stderr\": 0.0047180171110879675\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.01450676952480424,\n \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938163\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6158135829516033,\n \"acc_stderr\": 0.004854082479916909,\n \"acc_norm\": 0.7999402509460267,\n \"acc_norm_stderr\": 0.003992272261659567\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.02964781353936525,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.02964781353936525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 
0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.037242495958177295,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.037242495958177295\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 
0.027493504244548057,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.0252544854247996,\n \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.0252544854247996\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.03156663099215416,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03156663099215416\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7853211009174312,\n \"acc_stderr\": 0.01760430414925648,\n \"acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.01760430414925648\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693254,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693254\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.042450224863844935,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.042450224863844935\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.024161618127987745,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.024161618127987745\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n 
\"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n \"acc_stderr\": 0.014774358319934486,\n \"acc_norm\": 0.7816091954022989,\n \"acc_norm_stderr\": 0.014774358319934486\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917212,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917212\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21899441340782122,\n \"acc_stderr\": 0.01383167668730318,\n \"acc_norm\": 0.21899441340782122,\n \"acc_norm_stderr\": 0.01383167668730318\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02699254433929724,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02699254433929724\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719964,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719964\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n \"acc_stderr\": 0.01260496081608737,\n \"acc_norm\": 0.4198174706649283,\n \"acc_norm_stderr\": 0.01260496081608737\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.02997280717046462,\n \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.02997280717046462\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5996732026143791,\n \"acc_stderr\": 0.01982184368827175,\n \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.01982184368827175\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5267824071398005,\n \"mc2_stderr\": 0.015695608410958812\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.01212140294285556\n 
},\n \"harness|drop|3\": {\n \"em\": 0.611996644295302,\n \"em_stderr\": 0.0049903604159338,\n \"f1\": 0.6469746224832212,\n \"f1_stderr\": 0.0047180171110879675\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.287338893100834,\n \"acc_stderr\": 0.012464677060107086\n }\n}\n```", "repo_url": "https://huggingface.co/openaccess-ai-collective/grendel", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|arc:challenge|25_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|drop|3_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|gsm8k|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hellaswag|10_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-02-28.206445.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-02-28.206445.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T14-02-28.206445.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T14-02-28.206445.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-02-28.206445.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["**/details_harness|winogrande|5_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T14-02-28.206445.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T14_02_28.206445", "path": ["results_2023-11-19T14-02-28.206445.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T14-02-28.206445.parquet"]}]}]} | 2023-11-19T14:06:14+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of openaccess-ai-collective/grendel
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model openaccess-ai-collective/grendel on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
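A minimal sketch of such a load call, assuming this card follows the leaderboard's usual `details_<org>__<model>_public` repository naming and using the `harness_winogrande_5` configuration listed in this card's metadata (both names should be adjusted to the config you actually want):

```python
from datasets import load_dataset

# Assumed repo id, following the open-llm-leaderboard "details_<org>__<model>_public" convention
data = load_dataset(
    "open-llm-leaderboard/details_openaccess-ai-collective__grendel_public",
    "harness_winogrande_5",  # any config_name from this card's metadata works here
    split="train",           # "train" always points to the latest results
)
```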
## Latest results
These are the latest results from run 2023-11-19T14:02:28.206445 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of openaccess-ai-collective/grendel",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/grendel on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T14:02:28.206445(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openaccess-ai-collective/grendel",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/grendel on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T14:02:28.206445(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
19,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/grendel## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/grendel on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T14:02:28.206445(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
c6569fc6d3ce84a4b4c30ff2e1a5ca24b332273a | 1tb(960gb) 19743 h. 3518 sp. | MrPrometheus/books_full_mini | [
"license:mit",
"region:us"
]
| 2023-11-19T14:18:56+00:00 | {"license": "mit"} | 2023-12-17T07:01:47+00:00 | []
| []
| TAGS
#license-mit #region-us
| 1tb(960gb) 19743 h. 3518 sp. | []
| [
"TAGS\n#license-mit #region-us \n"
]
| [
11
]
| [
"passage: TAGS\n#license-mit #region-us \n"
]
|
f73aae7458165464a1d05001fdb2747f4685a72b |
# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-6.7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [AI-Sweden-Models/gpt-sw3-6.7b](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-6.7b_public",
"harness_winogrande_5",
split="train")
```
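
The aggregated metrics shown below can be loaded the same way by targeting the "results" configuration and the "latest" split; this is a sketch of that call (the exact columns of the returned table are not documented here):

```python
from datasets import load_dataset

# Aggregated per-run metrics; the "latest" split always points to the newest run
results = load_dataset(
    "open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-6.7b_public",
    "results",
    split="latest",
)
print(results[0])  # one row per evaluation run, holding the aggregated scores
```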
## Latest results
These are the [latest results from run 2023-11-19T14:20:27.241108](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-6.7b_public/blob/main/results_2023-11-19T14-20-27.241108.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26572380055428,
"acc_stderr": 0.03108769443217598,
"acc_norm": 0.26762761682482883,
"acc_norm_stderr": 0.031880299374266195,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.39038444805230477,
"mc2_stderr": 0.013873169667140115,
"em": 0.04058305369127517,
"em_stderr": 0.002020764506779191,
"f1": 0.08923238255033483,
"f1_stderr": 0.0023387637858432947
},
"harness|arc:challenge|25": {
"acc": 0.32081911262798635,
"acc_stderr": 0.013640943091946526,
"acc_norm": 0.363481228668942,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.4570802628958375,
"acc_stderr": 0.004971364031062592,
"acc_norm": 0.6075482971519618,
"acc_norm_stderr": 0.0048729844929680105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196665,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196665
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.026754391348039776,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.026754391348039776
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.03090379695211449,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.03090379695211449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.035679697722680474,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.035679697722680474
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29797979797979796,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.29797979797979796,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.03027690994517826,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.03027690994517826
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.018368176306598615,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.018368176306598615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.0281396894448597,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.0281396894448597
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598018,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874975,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874975
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822583,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822583
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.016050792148036543,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.016050792148036543
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3054662379421222,
"acc_stderr": 0.02616058445014049,
"acc_norm": 0.3054662379421222,
"acc_norm_stderr": 0.02616058445014049
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.02635806569888059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27444589308996087,
"acc_stderr": 0.011397043163078154,
"acc_norm": 0.27444589308996087,
"acc_norm_stderr": 0.011397043163078154
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16176470588235295,
"acc_stderr": 0.02236867256288675,
"acc_norm": 0.16176470588235295,
"acc_norm_stderr": 0.02236867256288675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.018152871051538816,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.018152871051538816
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2612244897959184,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.2612244897959184,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.24096385542168675,
"acc_stderr": 0.033293941190735296,
"acc_norm": 0.24096385542168675,
"acc_norm_stderr": 0.033293941190735296
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.39038444805230477,
"mc2_stderr": 0.013873169667140115
},
"harness|winogrande|5": {
"acc": 0.6069455406471981,
"acc_stderr": 0.013727276249108437
},
"harness|drop|3": {
"em": 0.04058305369127517,
"em_stderr": 0.002020764506779191,
"f1": 0.08923238255033483,
"f1_stderr": 0.0023387637858432947
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948049
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-6.7b | [
"region:us"
]
| 2023-11-19T14:22:48+00:00 | {"pretty_name": "Evaluation run of AI-Sweden-Models/gpt-sw3-6.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [AI-Sweden-Models/gpt-sw3-6.7b](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-6.7b_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T14:20:27.241108](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-6.7b_public/blob/main/results_2023-11-19T14-20-27.241108.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26572380055428,\n \"acc_stderr\": 0.03108769443217598,\n \"acc_norm\": 0.26762761682482883,\n \"acc_norm_stderr\": 0.031880299374266195,\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.39038444805230477,\n \"mc2_stderr\": 0.013873169667140115,\n \"em\": 0.04058305369127517,\n \"em_stderr\": 0.002020764506779191,\n \"f1\": 0.08923238255033483,\n \"f1_stderr\": 0.0023387637858432947\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.32081911262798635,\n \"acc_stderr\": 0.013640943091946526,\n \"acc_norm\": 0.363481228668942,\n \"acc_norm_stderr\": 0.014056207319068285\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4570802628958375,\n \"acc_stderr\": 0.004971364031062592,\n \"acc_norm\": 0.6075482971519618,\n \"acc_norm_stderr\": 0.0048729844929680105\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196665,\n \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196665\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n 
\"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.026754391348039776,\n \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.026754391348039776\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.03090379695211449,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.03090379695211449\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.035679697722680474,\n \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.035679697722680474\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.29797979797979796,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\": 0.29797979797979796,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.22797927461139897,\n \"acc_stderr\": 0.03027690994517826,\n \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.03027690994517826\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24220183486238533,\n \"acc_stderr\": 0.018368176306598615,\n \"acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.018368176306598615\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2175925925925926,\n \"acc_stderr\": 0.0281396894448597,\n \"acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.0281396894448597\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083291,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083291\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598018,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598018\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n \"acc_stderr\": 0.029763779406874975,\n \"acc_norm\": 0.26905829596412556,\n \"acc_norm_stderr\": 0.029763779406874975\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.371900826446281,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822583,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822583\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n 
\"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n \"acc_stderr\": 0.016050792148036543,\n \"acc_norm\": 0.2796934865900383,\n \"acc_norm_stderr\": 0.016050792148036543\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3054662379421222,\n \"acc_stderr\": 0.02616058445014049,\n \"acc_norm\": 0.3054662379421222,\n \"acc_norm_stderr\": 0.02616058445014049\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27444589308996087,\n \"acc_stderr\": 0.011397043163078154,\n \"acc_norm\": 0.27444589308996087,\n \"acc_norm_stderr\": 0.011397043163078154\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.16176470588235295,\n \"acc_stderr\": 0.02236867256288675,\n \"acc_norm\": 0.16176470588235295,\n \"acc_norm_stderr\": 0.02236867256288675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.018152871051538816,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.018152871051538816\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2612244897959184,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.2612244897959184,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.24096385542168675,\n \"acc_stderr\": 0.033293941190735296,\n \"acc_norm\": 0.24096385542168675,\n \"acc_norm_stderr\": 0.033293941190735296\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.39038444805230477,\n \"mc2_stderr\": 0.013873169667140115\n },\n \"harness|winogrande|5\": {\n \"acc\": 
0.6069455406471981,\n \"acc_stderr\": 0.013727276249108437\n },\n \"harness|drop|3\": {\n \"em\": 0.04058305369127517,\n \"em_stderr\": 0.002020764506779191,\n \"f1\": 0.08923238255033483,\n \"f1_stderr\": 0.0023387637858432947\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.002001305720948049\n }\n}\n```", "repo_url": "https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|arc:challenge|25_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|drop|3_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|gsm8k|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hellaswag|10_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-20-27.241108.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-20-27.241108.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T14-20-27.241108.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["**/details_harness|winogrande|5_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T14-20-27.241108.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T14_20_27.241108", "path": ["results_2023-11-19T14-20-27.241108.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T14-20-27.241108.parquet"]}]}]} | 2023-11-19T14:23:33+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-6.7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model AI-Sweden-Models/gpt-sw3-6.7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
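A minimal sketch, assuming the details repository for this card follows the usual leaderboard naming convention (`open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-6.7b`) and exposes the same config names as the other evaluation runs in this document:

```python
from datasets import load_dataset

# Load the per-sample details for one task/config of this evaluation run.
# The repository id below is inferred from the model name and may need adjusting.
data = load_dataset(
    "open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-6.7b",
    "harness_winogrande_5",
    split="train",
)
```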
## Latest results
These are the latest results from run 2023-11-19T14:20:27.241108 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-6.7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AI-Sweden-Models/gpt-sw3-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T14:20:27.241108(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-6.7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AI-Sweden-Models/gpt-sw3-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T14:20:27.241108(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
26,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-6.7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model AI-Sweden-Models/gpt-sw3-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T14:20:27.241108(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
ebee8e5f15364b6153b072ce3f8ab46449628f59 | # Dataset Card for "lsc_binaryclassification_top2vec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tomashs/lsc_binaryclassification_top2vec | [
"region:us"
]
| 2023-11-19T14:25:18+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "short_form", "dtype": "string"}, {"name": "long_form", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "topic_vector", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 534759420, "num_examples": 400268}], "download_size": 137100468, "dataset_size": 534759420}} | 2023-11-19T14:29:43+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "lsc_binaryclassification_top2vec"
More Information needed | [
"# Dataset Card for \"lsc_binaryclassification_top2vec\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"lsc_binaryclassification_top2vec\"\n\nMore Information needed"
]
| [
6,
21
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"lsc_binaryclassification_top2vec\"\n\nMore Information needed"
]
|
5f4526d9a628ca1521f4c20918ece704d149d064 |
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Locutusque/TinyMistral-248m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248m](https://huggingface.co/Locutusque/TinyMistral-248m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248m",
"harness_winogrande_5",
split="train")
```
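
The aggregated metrics can be loaded the same way through the "results" configuration; a small sketch, assuming the "latest" split name used throughout this card:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_Locutusque__TinyMistral-248m",
    "results",
    split="latest",
)
```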
## Latest results
These are the [latest results from run 2023-12-04T09:49:00.111287](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248m/blob/main/results_2023-12-04T09-49-00.111287.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23192114355412077,
"acc_stderr": 0.029929988471137366,
"acc_norm": 0.2323220720193673,
"acc_norm_stderr": 0.030724605536997993,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396743,
"mc2": 0.42524397939784214,
"mc2_stderr": 0.015138993526497108
},
"harness|arc:challenge|25": {
"acc": 0.18088737201365188,
"acc_stderr": 0.01124857446740703,
"acc_norm": 0.22866894197952217,
"acc_norm_stderr": 0.012272853582540795
},
"harness|hellaswag|10": {
"acc": 0.27016530571599284,
"acc_stderr": 0.004431375549911367,
"acc_norm": 0.2802230631348337,
"acc_norm_stderr": 0.004481902637505664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118362,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118362
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.021132859182754454,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.021132859182754454
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848876,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848876
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.022331707611823088,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.022331707611823088
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1625615763546798,
"acc_stderr": 0.025960300064605576,
"acc_norm": 0.1625615763546798,
"acc_norm_stderr": 0.025960300064605576
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.021020672680827912,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.021020672680827912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000683,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000683
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1908256880733945,
"acc_stderr": 0.016847676400091112,
"acc_norm": 0.1908256880733945,
"acc_norm_stderr": 0.016847676400091112
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.01530238012354209,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.01530238012354209
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927237,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927237
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1801470588235294,
"acc_stderr": 0.02334516361654485,
"acc_norm": 0.1801470588235294,
"acc_norm_stderr": 0.02334516361654485
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322256,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322256
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348373,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348373
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071856,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071856
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396743,
"mc2": 0.42524397939784214,
"mc2_stderr": 0.015138993526497108
},
"harness|winogrande|5": {
"acc": 0.4980268350434096,
"acc_stderr": 0.014052376259225627
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Locutusque__TinyMistral-248m | [
"region:us"
]
| 2023-11-19T14:35:50+00:00 | {"pretty_name": "Evaluation run of Locutusque/TinyMistral-248m", "dataset_summary": "Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248m](https://huggingface.co/Locutusque/TinyMistral-248m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__TinyMistral-248m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T09:49:00.111287](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248m/blob/main/results_2023-12-04T09-49-00.111287.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23192114355412077,\n \"acc_stderr\": 0.029929988471137366,\n \"acc_norm\": 0.2323220720193673,\n \"acc_norm_stderr\": 0.030724605536997993,\n \"mc1\": 0.22276621787025705,\n \"mc1_stderr\": 0.014566506961396743,\n \"mc2\": 0.42524397939784214,\n \"mc2_stderr\": 0.015138993526497108\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.18088737201365188,\n \"acc_stderr\": 0.01124857446740703,\n \"acc_norm\": 0.22866894197952217,\n \"acc_norm_stderr\": 0.012272853582540795\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27016530571599284,\n \"acc_stderr\": 0.004431375549911367,\n \"acc_norm\": 0.2802230631348337,\n \"acc_norm_stderr\": 0.004481902637505664\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118362,\n \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118362\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n 
\"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.021132859182754454,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.021132859182754454\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04006168083848876,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04006168083848876\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.19032258064516128,\n \"acc_stderr\": 0.022331707611823088,\n \"acc_norm\": 0.19032258064516128,\n \"acc_norm_stderr\": 0.022331707611823088\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1625615763546798,\n \"acc_stderr\": 0.025960300064605576,\n \"acc_norm\": 0.1625615763546798,\n \"acc_norm_stderr\": 0.025960300064605576\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000683,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000683\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1908256880733945,\n \"acc_stderr\": 0.016847676400091112,\n \"acc_norm\": 0.1908256880733945,\n \"acc_norm_stderr\": 0.016847676400091112\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.01530238012354209,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.01530238012354209\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005716,\n \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n \"acc_stderr\": 0.011005971399927237,\n \"acc_norm\": 0.24641460234680573,\n \"acc_norm_stderr\": 0.011005971399927237\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.1801470588235294,\n \"acc_stderr\": 0.02334516361654485,\n \"acc_norm\": 0.1801470588235294,\n \"acc_norm_stderr\": 0.02334516361654485\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322256,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322256\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348373,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348373\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n \"acc_stderr\": 0.03410646614071856,\n \"acc_norm\": 0.25903614457831325,\n \"acc_norm_stderr\": 0.03410646614071856\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n \"mc1_stderr\": 0.014566506961396743,\n \"mc2\": 0.42524397939784214,\n \"mc2_stderr\": 0.015138993526497108\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4980268350434096,\n \"acc_stderr\": 0.014052376259225627\n },\n \"harness|gsm8k|5\": 
{\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Locutusque/TinyMistral-248m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|arc:challenge|25_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|arc:challenge|25_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|drop|3_2023-11-19T14-32-50.215947.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T14-32-50.215947.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|gsm8k|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|gsm8k|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hellaswag|10_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hellaswag|10_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-32-50.215947.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T14-32-50.215947.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T09-49-00.111287.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T09-49-00.111287.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T09-49-00.111287.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": 
["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": 
"2023_11_19T14_32_50.215947", "path": ["**/details_harness|winogrande|5_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["**/details_harness|winogrande|5_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T09-49-00.111287.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T14_32_50.215947", "path": ["results_2023-11-19T14-32-50.215947.parquet"]}, {"split": "2023_12_04T09_49_00.111287", "path": ["results_2023-12-04T09-49-00.111287.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T09-49-00.111287.parquet"]}]}]} | 2023-12-04T09:52:39+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Locutusque/TinyMistral-248m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
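A minimal sketch of such a call is given below; the dataset path is assumed from the leaderboard's usual `details_<org>__<model>_public` naming convention rather than copied from this card, so verify it against the actual repository before use:

```python
from datasets import load_dataset

# Dataset path assumed from the leaderboard's naming convention for detail datasets.
data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248m_public",
	"harness_winogrande_5",
	split="train")
```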
## Latest results
These are the latest results from run 2023-12-04T09:49:00.111287 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
2e1a829bd21aa4aace9a3ba9c974581983f97611 |
# Dataset Card for Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [l3utterfly/mistral-7b-v0.1-layla-v1](https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v1_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-19T14:33:16.714547](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v1_public/blob/main/results_2023-11-19T14-33-16.714547.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5984567193500476,
"acc_stderr": 0.03306896727223633,
"acc_norm": 0.6069576555275077,
"acc_norm_stderr": 0.03379036516702569,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.48895630003672147,
"mc2_stderr": 0.01547031853751192,
"em": 0.3457424496644295,
"em_stderr": 0.0048706887210275595,
"f1": 0.40009542785235,
"f1_stderr": 0.004747563788656675
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.6015358361774744,
"acc_norm_stderr": 0.014306946052735567
},
"harness|hellaswag|10": {
"acc": 0.6378211511651065,
"acc_stderr": 0.0047964786644038426,
"acc_norm": 0.8325034853614818,
"acc_norm_stderr": 0.003726554129348462
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368881,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368881
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710855,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710855
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988836,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988836
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153183,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388995,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388995
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284062,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4406779661016949,
"acc_stderr": 0.01268003799409707,
"acc_norm": 0.4406779661016949,
"acc_norm_stderr": 0.01268003799409707
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777515,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.036845294917747115,
"acc_norm": 0.84,
"acc_norm_stderr": 0.036845294917747115
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.48895630003672147,
"mc2_stderr": 0.01547031853751192
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224176
},
"harness|drop|3": {
"em": 0.3457424496644295,
"em_stderr": 0.0048706887210275595,
"f1": 0.40009542785235,
"f1_stderr": 0.004747563788656675
},
"harness|gsm8k|5": {
"acc": 0.1683093252463988,
"acc_stderr": 0.010305695358125519
}
}
```
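If you would rather read these aggregated numbers programmatically than copy them out of the JSON above, a minimal sketch — assuming the `results` configuration and its `latest` split listed in this repository's metadata — looks like this:

```python
from datasets import load_dataset

# Sketch only: the "results" configuration of the details repository stores the
# aggregated metrics, and its "latest" split points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v1_public",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores shown above
```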
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v1 | [
"region:us"
]
| 2023-11-19T14:36:17+00:00 | {"pretty_name": "Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [l3utterfly/mistral-7b-v0.1-layla-v1](https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v1_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T14:33:16.714547](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v1_public/blob/main/results_2023-11-19T14-33-16.714547.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5984567193500476,\n \"acc_stderr\": 0.03306896727223633,\n \"acc_norm\": 0.6069576555275077,\n \"acc_norm_stderr\": 0.03379036516702569,\n \"mc1\": 0.33659730722154224,\n \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.48895630003672147,\n \"mc2_stderr\": 0.01547031853751192,\n \"em\": 0.3457424496644295,\n \"em_stderr\": 0.0048706887210275595,\n \"f1\": 0.40009542785235,\n \"f1_stderr\": 0.004747563788656675\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n \"acc_norm\": 0.6015358361774744,\n \"acc_norm_stderr\": 0.014306946052735567\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6378211511651065,\n \"acc_stderr\": 0.0047964786644038426,\n \"acc_norm\": 0.8325034853614818,\n \"acc_norm_stderr\": 0.003726554129348462\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368881,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368881\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 
0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.6967741935483871,\n \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.034867317274198714,\n \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.034867317274198714\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710855,\n \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710855\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n \"acc_stderr\": 0.015133383278988836,\n \"acc_norm\": 0.7662835249042146,\n \"acc_norm_stderr\": 0.015133383278988836\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153183,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388995,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388995\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284062,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n \"acc_stderr\": 0.01268003799409707,\n \"acc_norm\": 0.4406779661016949,\n \"acc_norm_stderr\": 0.01268003799409707\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777515,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.036845294917747115,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.036845294917747115\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.48895630003672147,\n \"mc2_stderr\": 0.01547031853751192\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224176\n },\n \"harness|drop|3\": {\n \"em\": 
0.3457424496644295,\n \"em_stderr\": 0.0048706887210275595,\n \"f1\": 0.40009542785235,\n \"f1_stderr\": 0.004747563788656675\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1683093252463988,\n \"acc_stderr\": 0.010305695358125519\n }\n}\n```", "repo_url": "https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|arc:challenge|25_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|drop|3_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|gsm8k|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hellaswag|10_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-33-16.714547.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-33-16.714547.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T14-33-16.714547.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T14-33-16.714547.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T14-33-16.714547.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["**/details_harness|winogrande|5_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T14-33-16.714547.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T14_33_16.714547", "path": ["results_2023-11-19T14-33-16.714547.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T14-33-16.714547.parquet"]}]}]} | 2023-11-19T14:37:01+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model l3utterfly/mistral-7b-v0.1-layla-v1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
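```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v1_public",
    "harness_winogrande_5",
    split="train",
)
```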
## Latest results
These are the latest results from run 2023-11-19T14:33:16.714547 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model l3utterfly/mistral-7b-v0.1-layla-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T14:33:16.714547(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model l3utterfly/mistral-7b-v0.1-layla-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T14:33:16.714547(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
28,
31,
177,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model l3utterfly/mistral-7b-v0.1-layla-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T14:33:16.714547(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
d782b62c57e92ac1ec7702e68eab856871d8850c |
# Dataset Card for PosterErase
[](https://github.com/shunk031/huggingface-datasets_PosterErase/actions/workflows/ci.yaml)
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/alimama-creative/Self-supervised-Text-Erasing
- **Repository:** https://github.com/shunk031/huggingface-datasets_PosterErase
- **Paper (Preprint):** https://arxiv.org/abs/2204.12743
- **Paper (ACMMM2022):** https://dl.acm.org/doi/abs/10.1145/3503161.3547905
### Dataset Summary
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The language data in PosterErase is in Chinese (BCP-47 zh).
## Dataset Structure
### Data Instances
To use the PosterErase dataset, you need to download the data via [Alibaba Cloud](https://tianchi.aliyun.com/dataset/134810).
Then place the downloaded files in the following structure and specify its path.
```
/path/to/datasets
├── erase_1.zip
├── erase_2.zip
├── erase_3.zip
├── erase_4.zip
├── erase_5.zip
└── erase_6.zip
```
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/PosterErase",
data_dir="/path/to/datasets/",
)
```
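Once loading succeeds, a quick way to check which splits and columns the loader produced is shown below (a minimal sketch; the exact field names are not documented in this card, so only split names, row counts, and column names are printed):
```python
# Continues from the snippet above: `dataset` is a DatasetDict.
# Field names are not documented here, so we only inspect the schema.
for split_name, split in dataset.items():
    print(split_name, split.num_rows, split.column_names)
```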
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
You can find the following statement in [the license section](https://tianchi.aliyun.com/dataset/134810#license) of [the dataset distribution location](https://tianchi.aliyun.com/dataset/134810).
> The dataset is distributed under the CC BY-SA 4.0 license.
However, the license setting on that page appears to be set to [CC-BY-SA-NC 4.0](http://creativecommons.org/licenses/by-sa/4.0/?spm=a2c22.12282016.0.0.7abc5a92qnyxdR).
### Citation Information
```bibtex
@inproceedings{jiang2022self,
title={Self-supervised text erasing with controllable image synthesis},
author={Jiang, Gangwei and Wang, Shiyao and Ge, Tiezheng and Jiang, Yuning and Wei, Ying and Lian, Defu},
booktitle={Proceedings of the 30th ACM International Conference on Multimedia},
pages={1973--1983},
year={2022}
}
```
### Contributions
Thanks to [alimama-creative](https://github.com/alimama-creative) for creating this dataset.
| shunk031/PosterErase | [
"task_categories:other",
"annotations_creators:machine-generated",
"language_creators:found",
"multilinguality:monolingual",
"source_datasets:original",
"language:zh",
"license:cc-by-sa-4.0",
"graphic design",
"arxiv:2204.12743",
"region:us"
]
| 2023-11-19T14:42:04+00:00 | {"annotations_creators": ["machine-generated"], "language_creators": ["found"], "language": ["zh"], "license": ["cc-by-sa-4.0"], "multilinguality": ["monolingual"], "size_categories": [], "source_datasets": ["original"], "task_categories": ["other"], "task_ids": [], "pretty_name": "PosterErase", "tags": ["graphic design"]} | 2023-11-19T14:43:14+00:00 | [
"2204.12743"
]
| [
"zh"
]
| TAGS
#task_categories-other #annotations_creators-machine-generated #language_creators-found #multilinguality-monolingual #source_datasets-original #language-Chinese #license-cc-by-sa-4.0 #graphic design #arxiv-2204.12743 #region-us
|
# Dataset Card for PosterErase
: URL
- Paper (ACMMM2022): URL
### Dataset Summary
### Supported Tasks and Leaderboards
### Languages
The language data in PKU-PosterLayout is in Chinese (BCP-47 zh).
## Dataset Structure
### Data Instances
To use PosterErase dataset, you need to download the dataset via Alibaba Cloud.
Then place the downloaded files in the following structure and specify its path.
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
You can find the following statement in the license section of the dataset distribution location.
> The dataset is distributed under the CC BY-SA 4.0 license.
However, the license setting on that page appears to be set to CC-BY-SA-NC 4.0.
### Contributions
Thanks to alimama-creative for creating this dataset.
| [
"# Dataset Card for PosterErase\n\n: URL\n- Paper (ACMMM2022): URL",
"### Dataset Summary",
"### Supported Tasks and Leaderboards",
"### Languages\n\nThe language data in PKU-PosterLayout is in Chinese (BCP-47 zh).",
"## Dataset Structure",
"### Data Instances\n\nTo use PosterErase dataset, you need to download the dataset via Alibaba Cloud.\nThen place the downloaded files in the following structure and specify its path.",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information\n\nYou can find the following statement in the license section of the dataset distribution location.\n\n> The dataset is distributed under the CC BY-SA 4.0 license.\n\nHowever, the license setting on that page appears to be set to CC-BY-SA-NC 4.0.",
"### Contributions\n\nThanks to alimama-creative for creating this dataset."
]
| [
"TAGS\n#task_categories-other #annotations_creators-machine-generated #language_creators-found #multilinguality-monolingual #source_datasets-original #language-Chinese #license-cc-by-sa-4.0 #graphic design #arxiv-2204.12743 #region-us \n",
"# Dataset Card for PosterErase\n\n: URL\n- Paper (ACMMM2022): URL",
"### Dataset Summary",
"### Supported Tasks and Leaderboards",
"### Languages\n\nThe language data in PKU-PosterLayout is in Chinese (BCP-47 zh).",
"## Dataset Structure",
"### Data Instances\n\nTo use PosterErase dataset, you need to download the dataset via Alibaba Cloud.\nThen place the downloaded files in the following structure and specify its path.",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information\n\nYou can find the following statement in the license section of the dataset distribution location.\n\n> The dataset is distributed under the CC BY-SA 4.0 license.\n\nHowever, the license setting on that page appears to be set to CC-BY-SA-NC 4.0.",
"### Contributions\n\nThanks to alimama-creative for creating this dataset."
]
| [
78,
15,
162,
30,
6,
10,
26,
6,
43,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
61,
18
]
| [
"passage: TAGS\n#task_categories-other #annotations_creators-machine-generated #language_creators-found #multilinguality-monolingual #source_datasets-original #language-Chinese #license-cc-by-sa-4.0 #graphic design #arxiv-2204.12743 #region-us \n# Dataset Card for PosterErase\n\n: URL\n- Paper (ACMMM2022): URL### Dataset Summary### Supported Tasks and Leaderboards### Languages\n\nThe language data in PKU-PosterLayout is in Chinese (BCP-47 zh).## Dataset Structure### Data Instances\n\nTo use PosterErase dataset, you need to download the dataset via Alibaba Cloud.\nThen place the downloaded files in the following structure and specify its path.### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators"
]
|
b81e768bf63e3bdfa365de95e2291a6d669a1147 |
# Dataset Card for Evaluation run of vihangd/shearedplats-2.7b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/vihangd/shearedplats-2.7b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [vihangd/shearedplats-2.7b-v2](https://huggingface.co/vihangd/shearedplats-2.7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vihangd__shearedplats-2.7b-v2_public",
"harness_winogrande_5",
split="train")
```
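The card also mentions an aggregated "results" configuration. A minimal sketch for loading it, assuming the same public repository name as in the snippet above:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "train" split points
# to the latest run, as described above.
results = load_dataset(
    "open-llm-leaderboard/details_vihangd__shearedplats-2.7b-v2_public",
    "results",
    split="train",
)
print(results[0])
```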
## Latest results
These are the [latest results from run 2023-11-19T15:14:51.109565](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__shearedplats-2.7b-v2_public/blob/main/results_2023-11-19T15-14-51.109565.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and under the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2834708463666077,
"acc_stderr": 0.0316438041216984,
"acc_norm": 0.28533191373449407,
"acc_norm_stderr": 0.03242801789499609,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.3975561831004256,
"mc2_stderr": 0.01443999930404212,
"em": 0.02097315436241611,
"em_stderr": 0.0014674686372139715,
"f1": 0.07344798657718132,
"f1_stderr": 0.0018673519634175401
},
"harness|arc:challenge|25": {
"acc": 0.38993174061433444,
"acc_stderr": 0.014252959848892884,
"acc_norm": 0.42406143344709896,
"acc_norm_stderr": 0.014441889627464401
},
"harness|hellaswag|10": {
"acc": 0.5428201553475404,
"acc_stderr": 0.0049714495527871765,
"acc_norm": 0.7257518422624976,
"acc_norm_stderr": 0.004452228541043551
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.30943396226415093,
"acc_stderr": 0.028450154794118627,
"acc_norm": 0.30943396226415093,
"acc_norm_stderr": 0.028450154794118627
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.17341040462427745,
"acc_stderr": 0.02886810787497064,
"acc_norm": 0.17341040462427745,
"acc_norm_stderr": 0.02886810787497064
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.037528339580033376,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.037528339580033376
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.02053948126188688,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.02053948126188688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848877,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848877
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.025649381063029254,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.025649381063029254
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173355,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173355
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3393939393939394,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.3393939393939394,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2849740932642487,
"acc_stderr": 0.03257714077709662,
"acc_norm": 0.2849740932642487,
"acc_norm_stderr": 0.03257714077709662
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.021763733684173923,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.021763733684173923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.024556172219141272,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.024556172219141272
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.02684151432295896,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.02684151432295896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804725,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804725
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29174311926605506,
"acc_stderr": 0.01948930096887653,
"acc_norm": 0.29174311926605506,
"acc_norm_stderr": 0.01948930096887653
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.02896370257079102,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.02896370257079102
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.016997123346113436,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.016997123346113436
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767865,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767865
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757475,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757475
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3022508038585209,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.3022508038585209,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3117283950617284,
"acc_stderr": 0.025773111169630453,
"acc_norm": 0.3117283950617284,
"acc_norm_stderr": 0.025773111169630453
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2653194263363755,
"acc_stderr": 0.011276198843958871,
"acc_norm": 0.2653194263363755,
"acc_norm_stderr": 0.011276198843958871
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1801470588235294,
"acc_stderr": 0.02334516361654485,
"acc_norm": 0.1801470588235294,
"acc_norm_stderr": 0.02334516361654485
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.018152871051538816,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.018152871051538816
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573037,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573037
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.036471685236832266,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.036471685236832266
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.036996580176568775,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.036996580176568775
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.3975561831004256,
"mc2_stderr": 0.01443999930404212
},
"harness|winogrande|5": {
"acc": 0.659037095501184,
"acc_stderr": 0.013322681435934786
},
"harness|drop|3": {
"em": 0.02097315436241611,
"em_stderr": 0.0014674686372139715,
"f1": 0.07344798657718132,
"f1_stderr": 0.0018673519634175401
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.003366022949726365
}
}
```
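As a quick example of working with these numbers, the snippet below computes a macro-average accuracy over the MMLU (hendrycksTest) subtasks. It is only a sketch: it assumes the JSON above has been saved locally as `results.json`, which is not part of the official harness output layout.
```python
import json

# Assumption: the results dict shown above was saved to results.json.
with open("results.json") as f:
    results = json.load(f)

# Collect per-subtask accuracies for the MMLU (hendrycksTest) tasks.
mmlu_accs = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU subtasks: {len(mmlu_accs)}, macro-average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```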
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_vihangd__shearedplats-2.7b-v2 | [
"region:us"
]
| 2023-11-19T15:17:55+00:00 | {"pretty_name": "Evaluation run of vihangd/shearedplats-2.7b-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [vihangd/shearedplats-2.7b-v2](https://huggingface.co/vihangd/shearedplats-2.7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__shearedplats-2.7b-v2_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T15:14:51.109565](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__shearedplats-2.7b-v2_public/blob/main/results_2023-11-19T15-14-51.109565.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2834708463666077,\n \"acc_stderr\": 0.0316438041216984,\n \"acc_norm\": 0.28533191373449407,\n \"acc_norm_stderr\": 0.03242801789499609,\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.3975561831004256,\n \"mc2_stderr\": 0.01443999930404212,\n \"em\": 0.02097315436241611,\n \"em_stderr\": 0.0014674686372139715,\n \"f1\": 0.07344798657718132,\n \"f1_stderr\": 0.0018673519634175401\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.38993174061433444,\n \"acc_stderr\": 0.014252959848892884,\n \"acc_norm\": 0.42406143344709896,\n \"acc_norm_stderr\": 0.014441889627464401\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5428201553475404,\n \"acc_stderr\": 0.0049714495527871765,\n \"acc_norm\": 0.7257518422624976,\n \"acc_norm_stderr\": 0.004452228541043551\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.30943396226415093,\n \"acc_stderr\": 0.028450154794118627,\n \"acc_norm\": 0.30943396226415093,\n \"acc_norm_stderr\": 0.028450154794118627\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 
0.3055555555555556,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.17341040462427745,\n \"acc_stderr\": 0.02886810787497064,\n \"acc_norm\": 0.17341040462427745,\n \"acc_norm_stderr\": 0.02886810787497064\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.02053948126188688,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.02053948126188688\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04006168083848877,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04006168083848877\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n \"acc_stderr\": 0.025649381063029254,\n \"acc_norm\": 0.2838709677419355,\n \"acc_norm_stderr\": 0.025649381063029254\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173355,\n \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173355\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3393939393939394,\n \"acc_stderr\": 0.03697442205031595,\n \"acc_norm\": 0.3393939393939394,\n \"acc_norm_stderr\": 0.03697442205031595\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2849740932642487,\n \"acc_stderr\": 
0.03257714077709662,\n \"acc_norm\": 0.2849740932642487,\n \"acc_norm_stderr\": 0.03257714077709662\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.021763733684173923,\n \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.021763733684173923\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.024556172219141272,\n \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.024556172219141272\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.02684151432295896,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.02684151432295896\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804725,\n \"acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804725\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.29174311926605506,\n \"acc_stderr\": 0.01948930096887653,\n \"acc_norm\": 0.29174311926605506,\n \"acc_norm_stderr\": 0.01948930096887653\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.02896370257079102,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.02896370257079102\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.371900826446281,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 
0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.016997123346113436,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.016997123346113436\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767865,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767865\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757475,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757475\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.3022508038585209,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3117283950617284,\n \"acc_stderr\": 0.025773111169630453,\n \"acc_norm\": 0.3117283950617284,\n \"acc_norm_stderr\": 0.025773111169630453\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2653194263363755,\n \"acc_stderr\": 0.011276198843958871,\n \"acc_norm\": 0.2653194263363755,\n \"acc_norm_stderr\": 0.011276198843958871\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.1801470588235294,\n \"acc_stderr\": 0.02334516361654485,\n \"acc_norm\": 0.1801470588235294,\n \"acc_norm_stderr\": 0.02334516361654485\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.018152871051538816,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.018152871051538816\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n \"acc_stderr\": 0.030965903123573037,\n \"acc_norm\": 0.25870646766169153,\n \"acc_norm_stderr\": 0.030965903123573037\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.036471685236832266,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.036471685236832266\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.036996580176568775,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.036996580176568775\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.3975561831004256,\n \"mc2_stderr\": 0.01443999930404212\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.659037095501184,\n 
\"acc_stderr\": 0.013322681435934786\n },\n \"harness|drop|3\": {\n \"em\": 0.02097315436241611,\n \"em_stderr\": 0.0014674686372139715,\n \"f1\": 0.07344798657718132,\n \"f1_stderr\": 0.0018673519634175401\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \"acc_stderr\": 0.003366022949726365\n }\n}\n```", "repo_url": "https://huggingface.co/vihangd/shearedplats-2.7b-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|drop|3_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-14-51.109565.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-14-51.109565.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-14-51.109565.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-14-51.109565.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-14-51.109565.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["**/details_harness|winogrande|5_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T15-14-51.109565.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T15_14_51.109565", "path": ["results_2023-11-19T15-14-51.109565.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T15-14-51.109565.parquet"]}]}]} | 2023-11-19T15:18:43+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of vihangd/shearedplats-2.7b-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model vihangd/shearedplats-2.7b-v2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
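A minimal sketch, assuming the repository name follows the `details_<org>__<model>_public` naming pattern used by the other evaluation runs in this document (the exact identifier is not confirmed here):
```python
from datasets import load_dataset

# Repository name inferred from the "details_<org>__<model>_public" pattern
# used by the other evaluation-run datasets in this document; treat it as an assumption.
data = load_dataset("open-llm-leaderboard/details_vihangd__shearedplats-2.7b-v2_public",
	"harness_winogrande_5",
	split="train")
```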
## Latest results
These are the latest results from run 2023-11-19T15:14:51.109565 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of vihangd/shearedplats-2.7b-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vihangd/shearedplats-2.7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:14:51.109565(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vihangd/shearedplats-2.7b-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vihangd/shearedplats-2.7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:14:51.109565(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
24,
31,
173,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vihangd/shearedplats-2.7b-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model vihangd/shearedplats-2.7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T15:14:51.109565(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
db9f5e60f1963a37480f6d43e871dadca5d66d67 | # Dataset Card for "MySentimentAnwar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Anwaarma/MySentimentAnwar | [
"region:us"
]
| 2023-11-19T15:32:56+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Negative", "1": "Positive"}}}}], "splits": [{"name": "train", "num_bytes": 376893.0610734115, "num_examples": 2882}, {"name": "test", "num_bytes": 96145, "num_examples": 580}], "download_size": 269068, "dataset_size": 473038.0610734115}} | 2023-11-19T15:33:03+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "MySentimentAnwar"
More Information needed | [
"# Dataset Card for \"MySentimentAnwar\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"MySentimentAnwar\"\n\nMore Information needed"
]
| [
6,
16
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"MySentimentAnwar\"\n\nMore Information needed"
]
|
bd78631b52065f4ffb0fac0ec1a83106bc72d35d |
# Dataset Card for Evaluation run of postbot/pythia-160m-hq-emails
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/postbot/pythia-160m-hq-emails
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [postbot/pythia-160m-hq-emails](https://huggingface.co/postbot/pythia-160m-hq-emails) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public",
"harness_winogrande_5",
split="train")
```
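The aggregated metrics described above live in the "results" configuration; the sketch below assumes the same split naming as the per-task configurations in this document (a timestamped split plus a "latest" split), which is the pattern the metadata for these evaluation runs uses:
```python
from datasets import load_dataset

# Aggregated run-level metrics; the "latest" split points at the most recent results file.
# The "results" config name and "latest" split follow the convention used by these
# evaluation-detail datasets and are assumed here rather than verified for this repo.
results = load_dataset("open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public",
	"results",
	split="latest")
print(results[0])
```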
## Latest results
These are the [latest results from run 2023-11-19T15:36:28.681873](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public/blob/main/results_2023-11-19T15-36-28.681873.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.264448312099342,
"acc_stderr": 0.031139481426485385,
"acc_norm": 0.26576656092101425,
"acc_norm_stderr": 0.03196953249728658,
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283349,
"mc2": 0.4550530004497737,
"mc2_stderr": 0.016187324561962944,
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.003122902684563759,
"f1_stderr": 0.00024291228896195137
},
"harness|arc:challenge|25": {
"acc": 0.19880546075085323,
"acc_stderr": 0.011662850198175543,
"acc_norm": 0.23122866894197952,
"acc_norm_stderr": 0.01232085883477228
},
"harness|hellaswag|10": {
"acc": 0.2813184624576778,
"acc_stderr": 0.004487235657955673,
"acc_norm": 0.3005377414857598,
"acc_norm_stderr": 0.004575548557275204
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.033176727875331574,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.033176727875331574
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708083,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708083
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.03095289021774988,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.03095289021774988
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.225531914893617,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.225531914893617,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.033184773338453315,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.033184773338453315
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.023507579020645344,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.023507579020645344
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.01859920636028741,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.01859920636028741
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.189873417721519,
"acc_stderr": 0.025530100460233504,
"acc_norm": 0.189873417721519,
"acc_norm_stderr": 0.025530100460233504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742177,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742177
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.034859460964757415,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.034859460964757415
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.027236013946196687,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.027236013946196687
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27458492975734355,
"acc_stderr": 0.015959829933084046,
"acc_norm": 0.27458492975734355,
"acc_norm_stderr": 0.015959829933084046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859936,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859936
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.02538951255272991,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.02538951255272991
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2333767926988266,
"acc_stderr": 0.010803108481179081,
"acc_norm": 0.2333767926988266,
"acc_norm_stderr": 0.010803108481179081
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016643,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016643
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.017242385828779582,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.017242385828779582
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39183673469387753,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.39183673469387753,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233134,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233134
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283349,
"mc2": 0.4550530004497737,
"mc2_stderr": 0.016187324561962944
},
"harness|winogrande|5": {
"acc": 0.5027624309392266,
"acc_stderr": 0.014052271211616438
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.003122902684563759,
"f1_stderr": 0.00024291228896195137
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_postbot__pythia-160m-hq-emails | [
"region:us"
]
| 2023-11-19T15:38:44+00:00 | {"pretty_name": "Evaluation run of postbot/pythia-160m-hq-emails", "dataset_summary": "Dataset automatically created during the evaluation run of model [postbot/pythia-160m-hq-emails](https://huggingface.co/postbot/pythia-160m-hq-emails) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T15:36:28.681873](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public/blob/main/results_2023-11-19T15-36-28.681873.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.264448312099342,\n \"acc_stderr\": 0.031139481426485385,\n \"acc_norm\": 0.26576656092101425,\n \"acc_norm_stderr\": 0.03196953249728658,\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.015274176219283349,\n \"mc2\": 0.4550530004497737,\n \"mc2_stderr\": 0.016187324561962944,\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 0.003122902684563759,\n \"f1_stderr\": 0.00024291228896195137\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.19880546075085323,\n \"acc_stderr\": 0.011662850198175543,\n \"acc_norm\": 0.23122866894197952,\n \"acc_norm_stderr\": 0.01232085883477228\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2813184624576778,\n \"acc_stderr\": 0.004487235657955673,\n \"acc_norm\": 0.3005377414857598,\n \"acc_norm_stderr\": 0.004575548557275204\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.033176727875331574,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.033176727875331574\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708083,\n \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708083\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 
0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.03095289021774988,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.03095289021774988\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.027321078417387533,\n \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.027321078417387533\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.033184773338453315,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.033184773338453315\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3471502590673575,\n \"acc_stderr\": 
0.03435696168361355,\n \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.03435696168361355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.023507579020645344,\n \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.023507579020645344\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.03017680828897434,\n \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.03017680828897434\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25137614678899084,\n \"acc_stderr\": 0.01859920636028741,\n \"acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.01859920636028741\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.189873417721519,\n \"acc_stderr\": 0.025530100460233504,\n \"acc_norm\": 0.189873417721519,\n \"acc_norm_stderr\": 0.025530100460233504\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041018,\n \"acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041018\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742177,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742177\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n \"acc_stderr\": 0.034859460964757415,\n \"acc_norm\": 0.16071428571428573,\n \"acc_norm_stderr\": 0.034859460964757415\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.027236013946196687,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.027236013946196687\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n 
\"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27458492975734355,\n \"acc_stderr\": 0.015959829933084046,\n \"acc_norm\": 0.27458492975734355,\n \"acc_norm_stderr\": 0.015959829933084046\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.021855255263421795,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.021855255263421795\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859936,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859936\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.2990353697749196,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042117,\n \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042117\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2375886524822695,\n \"acc_stderr\": 0.02538951255272991,\n \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.02538951255272991\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2333767926988266,\n \"acc_stderr\": 0.010803108481179081,\n \"acc_norm\": 0.2333767926988266,\n \"acc_norm_stderr\": 0.010803108481179081\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016643,\n \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016643\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.017242385828779582,\n \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.017242385828779582\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.39183673469387753,\n \"acc_stderr\": 0.031251275910891656,\n \"acc_norm\": 0.39183673469387753,\n \"acc_norm_stderr\": 0.031251275910891656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n \"acc_stderr\": 0.03384429155233134,\n \"acc_norm\": 0.25301204819277107,\n \"acc_norm_stderr\": 0.03384429155233134\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.015274176219283349,\n \"mc2\": 0.4550530004497737,\n \"mc2_stderr\": 0.016187324561962944\n },\n \"harness|winogrande|5\": {\n \"acc\": 
0.5027624309392266,\n \"acc_stderr\": 0.014052271211616438\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 0.003122902684563759,\n \"f1_stderr\": 0.00024291228896195137\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/postbot/pythia-160m-hq-emails", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|drop|3_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-28.681873.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-28.681873.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-28.681873.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-28.681873.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-28.681873.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["**/details_harness|winogrande|5_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T15-36-28.681873.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T15_36_28.681873", "path": ["results_2023-11-19T15-36-28.681873.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T15-36-28.681873.parquet"]}]}]} | 2023-11-19T15:39:37+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of postbot/pythia-160m-hq-emails
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model postbot/pythia-160m-hq-emails on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
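For example, a minimal sketch (the repository name below is assumed from the leaderboard's usual `details_<org>__<model>_public` naming convention rather than quoted from this card):

```python
from datasets import load_dataset

# Assumed repository name, following the leaderboard's naming convention
data = load_dataset(
    "open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public",
    "harness_winogrande_5",  # one of the 64 per-task configurations
    split="train",           # "train" always points to the latest results
)
```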
## Latest results
These are the latest results from run 2023-11-19T15:36:28.681873 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of postbot/pythia-160m-hq-emails",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model postbot/pythia-160m-hq-emails on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:36:28.681873(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of postbot/pythia-160m-hq-emails",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model postbot/pythia-160m-hq-emails on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:36:28.681873(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
23,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of postbot/pythia-160m-hq-emails## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model postbot/pythia-160m-hq-emails on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T15:36:28.681873(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
0277f5953ce9e2b6bcbbd154a6e125a310b1f33d |
# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA_public",
"harness_winogrande_5",
split="train")
```
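The aggregated metrics for the run can be loaded the same way from the "results" configuration (a minimal sketch; the config name and the "latest" split are taken from this card's file listing):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA_public",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated accuracy fields
```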
## Latest results
These are the [latest results from run 2023-11-19T15:36:58.669943](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA_public/blob/main/results_2023-11-19T15-36-58.669943.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6304363255253682,
"acc_stderr": 0.03221802645295953,
"acc_norm": 0.6394242549229363,
"acc_norm_stderr": 0.03291185339071779,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.45750130636610203,
"mc2_stderr": 0.014435953658631701,
"em": 0.005557885906040268,
"em_stderr": 0.0007613497667018497,
"f1": 0.06505977348993297,
"f1_stderr": 0.0015006861389720051
},
"harness|arc:challenge|25": {
"acc": 0.5793515358361775,
"acc_stderr": 0.014426211252508401,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.014235872487909869
},
"harness|hellaswag|10": {
"acc": 0.641804421429994,
"acc_stderr": 0.004784901248558711,
"acc_norm": 0.8452499502091216,
"acc_norm_stderr": 0.003609271000593054
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295845,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295845
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922526,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172542,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172542
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45632333767926986,
"acc_stderr": 0.012721420501462547,
"acc_norm": 0.45632333767926986,
"acc_norm_stderr": 0.012721420501462547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411955,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507215,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507215
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.046313813194254656,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.046313813194254656
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.45750130636610203,
"mc2_stderr": 0.014435953658631701
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090254
},
"harness|drop|3": {
"em": 0.005557885906040268,
"em_stderr": 0.0007613497667018497,
"f1": 0.06505977348993297,
"f1_stderr": 0.0015006861389720051
},
"harness|gsm8k|5": {
"acc": 0.18119787717968158,
"acc_stderr": 0.010609827611527355
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA | [
"region:us"
]
| 2023-11-19T15:40:00+00:00 | {"pretty_name": "Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA", "dataset_summary": "Dataset automatically created during the evaluation run of model [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T15:36:58.669943](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA_public/blob/main/results_2023-11-19T15-36-58.669943.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6304363255253682,\n \"acc_stderr\": 0.03221802645295953,\n \"acc_norm\": 0.6394242549229363,\n \"acc_norm_stderr\": 0.03291185339071779,\n \"mc1\": 0.3108935128518972,\n \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.45750130636610203,\n \"mc2_stderr\": 0.014435953658631701,\n \"em\": 0.005557885906040268,\n \"em_stderr\": 0.0007613497667018497,\n \"f1\": 0.06505977348993297,\n \"f1_stderr\": 0.0015006861389720051\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.014426211252508401,\n \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.014235872487909869\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.641804421429994,\n \"acc_stderr\": 0.004784901248558711,\n \"acc_norm\": 0.8452499502091216,\n \"acc_norm_stderr\": 0.003609271000593054\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n 
\"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295845,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295845\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n 
\"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.013853724170922526,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922526\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n \"acc_stderr\": 0.015551673652172542,\n \"acc_norm\": 0.31620111731843575,\n \"acc_norm_stderr\": 0.015551673652172542\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912258,\n \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912258\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45632333767926986,\n \"acc_stderr\": 0.012721420501462547,\n \"acc_norm\": 0.45632333767926986,\n \"acc_norm_stderr\": 0.012721420501462547\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411955,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411955\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507215,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507215\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.046313813194254656,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.046313813194254656\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.45750130636610203,\n \"mc2_stderr\": 
0.014435953658631701\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090254\n },\n \"harness|drop|3\": {\n \"em\": 0.005557885906040268,\n \"em_stderr\": 0.0007613497667018497,\n \"f1\": 0.06505977348993297,\n \"f1_stderr\": 0.0015006861389720051\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18119787717968158,\n \"acc_stderr\": 0.010609827611527355\n }\n}\n```", "repo_url": "https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|drop|3_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-58.669943.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-58.669943.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-58.669943.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-58.669943.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["**/details_harness|winogrande|5_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T15-36-58.669943.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T15_36_58.669943", "path": ["results_2023-11-19T15-36-58.669943.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T15-36-58.669943.parquet"]}]}]} | 2023-11-19T15:40:46+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA on the Open LLM Leaderboard.
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-11-19T15:36:58.669943(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:36:58.669943(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:36:58.669943(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
38,
31,
187,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-DPO-7B-QLoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T15:36:58.669943(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
7f51015ebf0f5ce478c8ad27b0cc6a4de3c31580 |
# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA_public",
"harness_winogrande_5",
split="train")
```
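The aggregated metrics mentioned above live in the "results" configuration. A minimal sketch for pulling them directly (assuming the same `datasets` API and the "latest" split name listed in this card's configuration section) could look like:

```python
from datasets import load_dataset

# "results" is the aggregated-results configuration described above;
# the "latest" split points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA_public",
    "results",
    split="latest",
)
print(results)
```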
## Latest results
These are the [latest results from run 2023-11-19T15:40:53.939427](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA_public/blob/main/results_2023-11-19T15-40-53.939427.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6304901523986502,
"acc_stderr": 0.03227432351145437,
"acc_norm": 0.6396379138474626,
"acc_norm_stderr": 0.03297469555234416,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.449449453863883,
"mc2_stderr": 0.014386188846092064,
"em": 0.00576761744966443,
"em_stderr": 0.0007755000442815149,
"f1": 0.06506291946308734,
"f1_stderr": 0.0015068091686217023
},
"harness|arc:challenge|25": {
"acc": 0.5733788395904437,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670726
},
"harness|hellaswag|10": {
"acc": 0.6395140410276837,
"acc_stderr": 0.0047916019756127646,
"acc_norm": 0.8423620792670783,
"acc_norm_stderr": 0.003636564286352675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.02446861524147892,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.02446861524147892
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394849,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394849
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295845,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295845
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569507,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569507
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876166,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876166
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.311731843575419,
"acc_stderr": 0.015491756531894637,
"acc_norm": 0.311731843575419,
"acc_norm_stderr": 0.015491756531894637
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.0239291555173513,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.0239291555173513
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507215,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507215
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.449449453863883,
"mc2_stderr": 0.014386188846092064
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722762
},
"harness|drop|3": {
"em": 0.00576761744966443,
"em_stderr": 0.0007755000442815149,
"f1": 0.06506291946308734,
"f1_stderr": 0.0015068091686217023
},
"harness|gsm8k|5": {
"acc": 0.17134192570128887,
"acc_stderr": 0.010379150273178359
}
}
```
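As a rough illustration of working with these numbers, the snippet below averages the per-subtask MMLU (`hendrycksTest`) accuracies. It is a sketch that assumes a local copy of the results JSON linked above; the top-level layout of that file may nest the scores under a `results` key, so the code falls back to the flat layout shown here if it does not.

```python
import json

# Hypothetical local copy of the results file linked above.
with open("results_2023-11-19T15-40-53.939427.json") as f:
    data = json.load(f)

# Use the nested "results" section if present, otherwise assume the flat layout printed above.
scores = data.get("results", data)

mmlu_acc = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_acc)} MMLU subtasks, mean acc = {sum(mmlu_acc) / len(mmlu_acc):.4f}")
```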
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA | [
"region:us"
]
| 2023-11-19T15:43:54+00:00 | {"pretty_name": "Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA", "dataset_summary": "Dataset automatically created during the evaluation run of model [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T15:40:53.939427](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA_public/blob/main/results_2023-11-19T15-40-53.939427.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6304901523986502,\n \"acc_stderr\": 0.03227432351145437,\n \"acc_norm\": 0.6396379138474626,\n \"acc_norm_stderr\": 0.03297469555234416,\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.449449453863883,\n \"mc2_stderr\": 0.014386188846092064,\n \"em\": 0.00576761744966443,\n \"em_stderr\": 0.0007755000442815149,\n \"f1\": 0.06506291946308734,\n \"f1_stderr\": 0.0015068091686217023\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920293,\n \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670726\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6395140410276837,\n \"acc_stderr\": 0.0047916019756127646,\n \"acc_norm\": 0.8423620792670783,\n \"acc_norm_stderr\": 0.003636564286352675\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n 
\"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 
0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.02446861524147892,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.02446861524147892\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394849,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394849\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295845,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295845\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569507,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569507\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281382\n },\n 
\"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.013890862162876166,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.013890862162876166\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577612,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577612\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.311731843575419,\n \"acc_stderr\": 0.015491756531894637,\n \"acc_norm\": 0.311731843575419,\n \"acc_norm_stderr\": 0.015491756531894637\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.0239291555173513,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.0239291555173513\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507215,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507215\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.449449453863883,\n \"mc2_stderr\": 
0.014386188846092064\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722762\n },\n \"harness|drop|3\": {\n \"em\": 0.00576761744966443,\n \"em_stderr\": 0.0007755000442815149,\n \"f1\": 0.06506291946308734,\n \"f1_stderr\": 0.0015068091686217023\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17134192570128887,\n \"acc_stderr\": 0.010379150273178359\n }\n}\n```", "repo_url": "https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|drop|3_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-40-53.939427.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-40-53.939427.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-40-53.939427.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-40-53.939427.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["**/details_harness|winogrande|5_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T15-40-53.939427.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T15_40_53.939427", "path": ["results_2023-11-19T15-40-53.939427.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T15-40-53.939427.parquet"]}]}]} | 2023-11-19T15:44:40+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
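(A minimal sketch using the repository and configuration names listed in this card's metadata; the second call pulls the aggregated "results" configuration.)

```python
from datasets import load_dataset

# Per-sample details for one task configuration of this run.
data = load_dataset("open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA_public",
                    "harness_winogrande_5",
                    split="train")

# Aggregated metrics live in the "results" configuration;
# its "latest" split always points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA_public",
                       "results",
                       split="latest")
```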
## Latest results
These are the latest results from run 2023-11-19T15:40:53.939427 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:40:53.939427(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:40:53.939427(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
38,
31,
187,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T15:40:53.939427(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
1d4c5cab9fe1d3e46552583fbcc6a6cc0f47ce05 |
# Dataset Card for Evaluation run of KnutJaegersberg/webMistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/webMistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/webMistral-7B](https://huggingface.co/KnutJaegersberg/webMistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__webMistral-7B_public",
                    "harness_winogrande_5",
                    split="train")
```
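
The aggregated metrics for this run live in the "results" configuration; a minimal sketch of inspecting them (assuming the same repository layout, with the "latest" split pointing at the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split always points to the most recent results.
results = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__webMistral-7B_public",
                       "results",
                       split="latest")

# Convert to a pandas DataFrame for easier inspection of the per-task scores.
df = results.to_pandas()
print(df.columns)
```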
## Latest results
These are the [latest results from run 2023-11-19T15:44:56.176634](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__webMistral-7B_public/blob/main/results_2023-11-19T15-44-56.176634.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5841590014890155,
"acc_stderr": 0.03333753951949392,
"acc_norm": 0.5936832280847818,
"acc_norm_stderr": 0.03411324688283648,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237272,
"mc2": 0.397102297699568,
"mc2_stderr": 0.014419759087988877,
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460787,
"f1": 0.05746224832214775,
"f1_stderr": 0.0013324273038450888
},
"harness|arc:challenge|25": {
"acc": 0.53839590443686,
"acc_stderr": 0.014568245550296356,
"acc_norm": 0.590443686006826,
"acc_norm_stderr": 0.014370358632472446
},
"harness|hellaswag|10": {
"acc": 0.6154152559251145,
"acc_stderr": 0.004855027248398159,
"acc_norm": 0.8089026090420235,
"acc_norm_stderr": 0.003923620666711542
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.029514703583981762,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.029514703583981762
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.037038511930995215,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.037038511930995215
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164552,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.024985354923102346,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.024985354923102346
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422882,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422882
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748929,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748929
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823288,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823288
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388873,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001865,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001865
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236855,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.01993362777685742,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.01993362777685742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111844,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111844
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237272,
"mc2": 0.397102297699568,
"mc2_stderr": 0.014419759087988877
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207394
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460787,
"f1": 0.05746224832214775,
"f1_stderr": 0.0013324273038450888
},
"harness|gsm8k|5": {
"acc": 0.0887035633055345,
"acc_stderr": 0.007831458737058717
}
}
```
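As a quick illustration (not part of the evaluation pipeline), the per-task block above can be post-processed with a few lines of standard-library Python. The sketch below assumes the results dictionary shown above has been saved locally as `results.json`; the file name and the aggregation choice (mean accuracy over the MMLU subtasks) are illustrative assumptions, not files or metrics shipped with this repository.

```python
import json
from statistics import mean

# Illustrative sketch: summarise the per-task results shown above.
# Assumes the JSON block above was saved locally as "results.json"
# (the file name is a placeholder, not a file provided by this repo).
with open("results.json") as f:
    results = json.load(f)

# Keep only the MMLU ("hendrycksTest") subtasks and their accuracies.
mmlu = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}

print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean(mmlu.values()):.4f}")

# Three strongest and three weakest subtasks for this run.
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked[:3]:
    print(f"  best : {task} -> {acc:.4f}")
for task, acc in ranked[-3:]:
    print(f"  worst: {task} -> {acc:.4f}")
```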
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
"region:us"
]
| 2023-11-19T15:47:56+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/webMistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/webMistral-7B](https://huggingface.co/KnutJaegersberg/webMistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__webMistral-7B_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T15:44:56.176634](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__webMistral-7B_public/blob/main/results_2023-11-19T15-44-56.176634.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5841590014890155,\n \"acc_stderr\": 0.03333753951949392,\n \"acc_norm\": 0.5936832280847818,\n \"acc_norm_stderr\": 0.03411324688283648,\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237272,\n \"mc2\": 0.397102297699568,\n \"mc2_stderr\": 0.014419759087988877,\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460787,\n \"f1\": 0.05746224832214775,\n \"f1_stderr\": 0.0013324273038450888\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.53839590443686,\n \"acc_stderr\": 0.014568245550296356,\n \"acc_norm\": 0.590443686006826,\n \"acc_norm_stderr\": 0.014370358632472446\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6154152559251145,\n \"acc_stderr\": 0.004855027248398159,\n \"acc_norm\": 0.8089026090420235,\n \"acc_norm_stderr\": 0.003923620666711542\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981762,\n \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981762\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n 
\"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.037038511930995215,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.037038511930995215\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.6967741935483871,\n \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164552,\n \"acc_norm\": 
0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164552\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.024985354923102346,\n \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.024985354923102346\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350194,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350194\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422882,\n \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422882\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.02363687331748929,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.02363687331748929\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 
0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n \"acc_stderr\": 0.014743125394823288,\n \"acc_norm\": 0.7828863346104725,\n \"acc_norm_stderr\": 0.014743125394823288\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388873,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001865,\n \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001865\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236855,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236855\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877746,\n \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.01993362777685742,\n \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.01993362777685742\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111844,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111844\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237272,\n \"mc2\": 0.397102297699568,\n \"mc2_stderr\": 0.014419759087988877\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207394\n },\n 
\"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460787,\n \"f1\": 0.05746224832214775,\n \"f1_stderr\": 0.0013324273038450888\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0887035633055345,\n \"acc_stderr\": 0.007831458737058717\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/webMistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|drop|3_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-44-56.176634.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-44-56.176634.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-44-56.176634.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-44-56.176634.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-44-56.176634.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["**/details_harness|winogrande|5_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T15-44-56.176634.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T15_44_56.176634", "path": ["results_2023-11-19T15-44-56.176634.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T15-44-56.176634.parquet"]}]}]} | 2023-11-19T15:48:43+00:00 | []
| []
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/webMistral-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/webMistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:44:56.176634(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/webMistral-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/webMistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T15:44:56.176634(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
20,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/webMistral-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/webMistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T15:44:56.176634(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
4d6ad4aa1b77a99244e33e785fd73204bb7fe407 |
# Dataset Card for Evaluation run of euclaise/Ferret-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/Ferret-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/Ferret-7B](https://huggingface.co/euclaise/Ferret-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__Ferret-7B_public",
"harness_winogrande_5",
split="train")
```
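
The aggregated metrics live in the "results" configuration, whose "latest" split always points to the most recent run. As a follow-up, here is a minimal sketch using the config and split names listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics; the "latest" split points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_euclaise__Ferret-7B_public",
	"results",
	split="latest")
print(results[0])
```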
## Latest results
These are the [latest results from run 2023-11-25T03:02:51.561913](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__Ferret-7B_public/blob/main/results_2023-11-25T03-02-51.561913.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5959498298780265,
"acc_stderr": 0.033140542039800984,
"acc_norm": 0.6066121431850051,
"acc_norm_stderr": 0.03397883209596383,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4001041496199733,
"mc2_stderr": 0.014571617835253216,
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177344,
"f1": 0.06579802852349013,
"f1_stderr": 0.0014930152947085352
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.014438036220848029,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192596
},
"harness|hellaswag|10": {
"acc": 0.6248755228042223,
"acc_stderr": 0.004831655648489736,
"acc_norm": 0.8130850428201554,
"acc_norm_stderr": 0.00389046515827181
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851088,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851088
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349954,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349954
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940798,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940798
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569506,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569506
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415926,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415926
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.026035386098951292,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.026035386098951292
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.014583812465862543,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.014583812465862543
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306376,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306376
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570762,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570762
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.02712195607138886,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.02712195607138886
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862737,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3924380704041721,
"acc_stderr": 0.012471243669229106,
"acc_norm": 0.3924380704041721,
"acc_norm_stderr": 0.012471243669229106
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.01967580813528151,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.01967580813528151
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.046313813194254656,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.046313813194254656
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.03078905113903081,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.03078905113903081
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017204,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4001041496199733,
"mc2_stderr": 0.014571617835253216
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205198
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177344,
"f1": 0.06579802852349013,
"f1_stderr": 0.0014930152947085352
},
"harness|gsm8k|5": {
"acc": 0.02047005307050796,
"acc_stderr": 0.003900413385915721
}
}
```
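
As an illustration (a sketch assuming the dictionary above has been loaded into a variable named `latest_results`, e.g. by reading the linked JSON file), the per-subject MMLU accuracies can be averaged as follows:

```python
# `latest_results` is assumed to hold the dictionary shown above.
mmlu_accs = [v["acc"] for k, v in latest_results.items()
             if k.startswith("harness|hendrycksTest")]
print(f"Average MMLU accuracy: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```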
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_euclaise__Ferret-7B | [
"region:us"
]
| 2023-11-19T15:55:56+00:00 | {"pretty_name": "Evaluation run of euclaise/Ferret-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [euclaise/Ferret-7B](https://huggingface.co/euclaise/Ferret-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__Ferret-7B_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-25T03:02:51.561913](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__Ferret-7B_public/blob/main/results_2023-11-25T03-02-51.561913.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5959498298780265,\n \"acc_stderr\": 0.033140542039800984,\n \"acc_norm\": 0.6066121431850051,\n \"acc_norm_stderr\": 0.03397883209596383,\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4001041496199733,\n \"mc2_stderr\": 0.014571617835253216,\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177344,\n \"f1\": 0.06579802852349013,\n \"f1_stderr\": 0.0014930152947085352\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848029,\n \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192596\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6248755228042223,\n \"acc_stderr\": 0.004831655648489736,\n \"acc_norm\": 0.8130850428201554,\n \"acc_norm_stderr\": 0.00389046515827181\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851088,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851088\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n \"acc_stderr\": 0.026729499068349954,\n \"acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.026729499068349954\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940798,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940798\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569506,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569506\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415926,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415926\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n \"acc_stderr\": 0.026035386098951292,\n \"acc_norm\": 0.8034188034188035,\n \"acc_norm_stderr\": 0.026035386098951292\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.789272030651341,\n \"acc_stderr\": 0.014583812465862543,\n \"acc_norm\": 0.789272030651341,\n \"acc_norm_stderr\": 0.014583812465862543\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306376,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306376\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.016251139711570762,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.016251139711570762\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.02712195607138886,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.02712195607138886\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862737,\n \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3924380704041721,\n \"acc_stderr\": 0.012471243669229106,\n \"acc_norm\": 0.3924380704041721,\n \"acc_norm_stderr\": 0.012471243669229106\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6160130718954249,\n \"acc_stderr\": 0.01967580813528151,\n \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.01967580813528151\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.046313813194254656,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.046313813194254656\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.03078905113903081,\n \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.03078905113903081\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.029475250236017204,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.029475250236017204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4001041496199733,\n \"mc2_stderr\": 0.014571617835253216\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205198\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177344,\n \"f1\": 0.06579802852349013,\n 
\"f1_stderr\": 0.0014930152947085352\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02047005307050796,\n \"acc_stderr\": 0.003900413385915721\n }\n}\n```", "repo_url": "https://huggingface.co/euclaise/Ferret-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|arc:challenge|25_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|arc:challenge|25_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|arc:challenge|25_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|arc:challenge|25_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|drop|3_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|drop|3_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|drop|3_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|drop|3_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|gsm8k|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|gsm8k|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|gsm8k|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|gsm8k|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hellaswag|10_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hellaswag|10_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hellaswag|10_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hellaswag|10_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-52-54.018947.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-52-54.018947.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T15-52-54.018947.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T02-44-41.580934.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-25T02-44-41.580934.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T02-50-24.454188.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-25T02-50-24.454188.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-25T02-50-24.454188.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T03-02-51.561913.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T03-02-51.561913.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-25T03-02-51.561913.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-25T03-02-51.561913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": 
"2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": 
["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", 
"path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": 
["**/details_harness|hendrycksTest-international_law|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-25T02-44-41.580934.parquet"]}, 
{"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-25T02-44-41.580934.parquet"]}, 
{"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["**/details_harness|winogrande|5_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": 
["**/details_harness|winogrande|5_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["**/details_harness|winogrande|5_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["**/details_harness|winogrande|5_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-25T03-02-51.561913.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T15_52_54.018947", "path": ["results_2023-11-19T15-52-54.018947.parquet"]}, {"split": "2023_11_25T02_44_41.580934", "path": ["results_2023-11-25T02-44-41.580934.parquet"]}, {"split": "2023_11_25T02_50_24.454188", "path": ["results_2023-11-25T02-50-24.454188.parquet"]}, {"split": "2023_11_25T03_02_51.561913", "path": ["results_2023-11-25T03-02-51.561913.parquet"]}, {"split": "latest", "path": ["results_2023-11-25T03-02-51.561913.parquet"]}]}]} | 2023-11-25T03:06:02+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of euclaise/Ferret-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model euclaise/Ferret-7B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
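For example, a minimal sketch (the repository name below is an assumption, inferred from the usual `open-llm-leaderboard/details_<org>__<model>_public` naming pattern rather than stated in this card):

```python
from datasets import load_dataset

# Hypothetical repository name following the usual leaderboard convention;
# "harness_winogrande_5" is one of the per-task configurations.
data = load_dataset("open-llm-leaderboard/details_euclaise__Ferret-7B_public",
                    "harness_winogrande_5",
                    split="train")
```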
## Latest results
These are the latest results from run 2023-11-25T03:02:51.561913 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of euclaise/Ferret-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/Ferret-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-25T03:02:51.561913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of euclaise/Ferret-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/Ferret-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-25T03:02:51.561913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
17,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of euclaise/Ferret-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/Ferret-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-25T03:02:51.561913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
e3b8c9ebf887d4090fc9fbd524e3dad07eabeefa |
# Dataset Card for Evaluation run of dfurman/Llama-2-13B-Instruct-v0.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dfurman/Llama-2-13B-Instruct-v0.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [dfurman/Llama-2-13B-Instruct-v0.2](https://huggingface.co/dfurman/Llama-2-13B-Instruct-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dfurman__Llama-2-13B-Instruct-v0.2_public",
"harness_winogrande_5",
split="train")
```
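The per-task configurations above return the raw per-example details. The aggregated scores live in the "results" configuration described earlier; a minimal sketch for pulling the most recent aggregate (using the config and split names listed in this card's metadata) is:
```python
from datasets import load_dataset
# "results" holds the aggregated metrics; the "latest" split points to the most recent run
results = load_dataset("open-llm-leaderboard/details_dfurman__Llama-2-13B-Instruct-v0.2_public",
	"results",
	split="latest")
```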
## Latest results
These are the [latest results from run 2023-11-19T16:07:16.774440](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__Llama-2-13B-Instruct-v0.2_public/blob/main/results_2023-11-19T16-07-16.774440.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5518079956209404,
"acc_stderr": 0.03372838747309601,
"acc_norm": 0.5599366815450036,
"acc_norm_stderr": 0.034517191338240875,
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520932,
"mc2": 0.4571097653810718,
"mc2_stderr": 0.014996550862444632,
"em": 0.003355704697986577,
"em_stderr": 0.0005922452850005415,
"f1": 0.07218225671140904,
"f1_stderr": 0.0015063738201574525
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467327
},
"harness|hellaswag|10": {
"acc": 0.6139215295757817,
"acc_stderr": 0.004858539527872461,
"acc_norm": 0.8195578570005975,
"acc_norm_stderr": 0.0038376937398170133
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.0241804971643769,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.0241804971643769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534323,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534323
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5230769230769231,
"acc_stderr": 0.025323990861736236,
"acc_norm": 0.5230769230769231,
"acc_norm_stderr": 0.025323990861736236
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.018904164171510186,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.018904164171510186
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243738,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243738
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729245,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729245
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.01543808308056897,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.01543808308056897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865467,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865467
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.02827549015679146,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.02827549015679146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596143,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596143
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42046936114732725,
"acc_stderr": 0.012607654553832705,
"acc_norm": 0.42046936114732725,
"acc_norm_stderr": 0.012607654553832705
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5506535947712419,
"acc_stderr": 0.020123766528027266,
"acc_norm": 0.5506535947712419,
"acc_norm_stderr": 0.020123766528027266
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520932,
"mc2": 0.4571097653810718,
"mc2_stderr": 0.014996550862444632
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497811
},
"harness|drop|3": {
"em": 0.003355704697986577,
"em_stderr": 0.0005922452850005415,
"f1": 0.07218225671140904,
"f1_stderr": 0.0015063738201574525
},
"harness|gsm8k|5": {
"acc": 0.0932524639878696,
"acc_stderr": 0.00800968883832858
}
}
```
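As a small illustration of how these numbers can be consumed programmatically, the sketch below assumes the JSON block above has been saved locally as `results.json` (the file name is only illustrative):
```python
import json
# Illustrative only: assumes the JSON results above were saved as results.json
with open("results.json") as f:
    scores = json.load(f)
# Headline aggregates computed by the harness
print("overall acc_norm:", scores["all"]["acc_norm"])
# Mean accuracy over the hendrycksTest (MMLU) subtasks
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")]
print("MMLU mean acc:", sum(mmlu) / len(mmlu))
```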
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_dfurman__Llama-2-13B-Instruct-v0.2 | [
"region:us"
]
| 2023-11-19T16:10:22+00:00 | {"pretty_name": "Evaluation run of dfurman/Llama-2-13B-Instruct-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [dfurman/Llama-2-13B-Instruct-v0.2](https://huggingface.co/dfurman/Llama-2-13B-Instruct-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dfurman__Llama-2-13B-Instruct-v0.2_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T16:07:16.774440](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__Llama-2-13B-Instruct-v0.2_public/blob/main/results_2023-11-19T16-07-16.774440.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5518079956209404,\n \"acc_stderr\": 0.03372838747309601,\n \"acc_norm\": 0.5599366815450036,\n \"acc_norm_stderr\": 0.034517191338240875,\n \"mc1\": 0.31211750305997554,\n \"mc1_stderr\": 0.016220756769520932,\n \"mc2\": 0.4571097653810718,\n \"mc2_stderr\": 0.014996550862444632,\n \"em\": 0.003355704697986577,\n \"em_stderr\": 0.0005922452850005415,\n \"f1\": 0.07218225671140904,\n \"f1_stderr\": 0.0015063738201574525\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467327\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6139215295757817,\n \"acc_stderr\": 0.004858539527872461,\n \"acc_norm\": 0.8195578570005975,\n \"acc_norm_stderr\": 0.0038376937398170133\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 
0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.328042328042328,\n \"acc_stderr\": 0.0241804971643769,\n \"acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.0241804971643769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n \"acc_stderr\": 0.027045746573534323,\n \"acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.027045746573534323\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7875647668393783,\n \"acc_stderr\": 
0.029519282616817234,\n \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736236,\n \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736236\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510186,\n \"acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510186\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243738,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243738\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n \"acc_stderr\": 0.01543808308056897,\n \"acc_norm\": 0.7522349936143039,\n \"acc_norm_stderr\": 0.01543808308056897\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n \"acc_stderr\": 0.016115235504865467,\n \"acc_norm\": 0.3664804469273743,\n \"acc_norm_stderr\": 0.016115235504865467\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.02827549015679146,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.02827549015679146\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596143,\n \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596143\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42046936114732725,\n \"acc_stderr\": 0.012607654553832705,\n \"acc_norm\": 0.42046936114732725,\n \"acc_norm_stderr\": 0.012607654553832705\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5506535947712419,\n \"acc_stderr\": 0.020123766528027266,\n \"acc_norm\": 0.5506535947712419,\n \"acc_norm_stderr\": 0.020123766528027266\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935893,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31211750305997554,\n \"mc1_stderr\": 0.016220756769520932,\n \"mc2\": 0.4571097653810718,\n \"mc2_stderr\": 0.014996550862444632\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497811\n },\n \"harness|drop|3\": {\n \"em\": 0.003355704697986577,\n 
\"em_stderr\": 0.0005922452850005415,\n \"f1\": 0.07218225671140904,\n \"f1_stderr\": 0.0015063738201574525\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0932524639878696,\n \"acc_stderr\": 0.00800968883832858\n }\n}\n```", "repo_url": "https://huggingface.co/dfurman/Llama-2-13B-Instruct-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|arc:challenge|25_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|drop|3_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|gsm8k|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hellaswag|10_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-07-16.774440.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-07-16.774440.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-07-16.774440.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T16-07-16.774440.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", 
"path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", 
"data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": 
["**/details_harness|truthfulqa:mc|0_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["**/details_harness|winogrande|5_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T16-07-16.774440.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T16_07_16.774440", "path": ["results_2023-11-19T16-07-16.774440.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T16-07-16.774440.parquet"]}]}]} | 2023-11-19T16:11:09+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of dfurman/Llama-2-13B-Instruct-v0.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model dfurman/Llama-2-13B-Instruct-v0.2 on the Open LLM Leaderboard.
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-11-19T16:07:16.774440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of dfurman/Llama-2-13B-Instruct-v0.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dfurman/Llama-2-13B-Instruct-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T16:07:16.774440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dfurman/Llama-2-13B-Instruct-v0.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dfurman/Llama-2-13B-Instruct-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T16:07:16.774440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
24,
31,
173,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dfurman/Llama-2-13B-Instruct-v0.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dfurman/Llama-2-13B-Instruct-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T16:07:16.774440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
f6ce4cb872b53c18440a26c8ec5f524f128c8bb1 | # Dataset Card for "turkishReviews-ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | imelike/turkishReviews-ds | [
"region:us"
]
| 2023-11-19T16:25:59+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "review", "dtype": "string"}, {"name": "review_length", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 134596742.14721414, "num_examples": 362520}, {"name": "validation", "num_bytes": 14955564.852785867, "num_examples": 40281}], "download_size": 95516965, "dataset_size": 149552307.0}} | 2023-12-06T15:10:14+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "turkishReviews-ds"
More Information needed | [
"# Dataset Card for \"turkishReviews-ds\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"turkishReviews-ds\"\n\nMore Information needed"
]
| [
6,
17
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"turkishReviews-ds\"\n\nMore Information needed"
]
|
fecb652ca568905befa133e2102a72a05b7f6eac | # Style Chatbot Dataset
A style recommendation dataset that contains input (body type and personal clothing style), context (event context), and response triplets. The responses are GPT-3.5-generated outfit combination recommendations given the input body type and personal style prompt and the target/context event.
Our dataset contains a variety of events such as business functions, cocktail parties, casual gatherings, fancy dates, etc. See an example Mistral-based fine-tuned [model](https://huggingface.co/neuralwork).
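
A minimal loading sketch with the `datasets` library: the field names (`input`, `context`, `completion`) and the single `train` split come from this dataset's feature schema; the snippet itself is only an illustrative example.

```python
from datasets import load_dataset

# Single "train" split with ~3.2k (input, context, completion) triplets.
ds = load_dataset("neuralwork/fashion-style-instruct", split="train")

example = ds[0]
print("Input (body type + personal style):", example["input"])
print("Context (target event):", example["context"])
print("Completion (outfit recommendation):", example["completion"])
```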
| neuralwork/fashion-style-instruct | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"region:us"
]
| 2023-11-19T16:38:42+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "completion", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7153246, "num_examples": 3193}], "download_size": 2643016, "dataset_size": 7153246}} | 2023-11-19T17:06:10+00:00 | []
| [
"en"
]
| TAGS
#task_categories-text-generation #size_categories-1K<n<10K #language-English #license-mit #region-us
| # Style Chatbot Dataset
A style recommendation dataset that contains input (body type and personal clothing style), context (event context) and response triplets. The responses are GPT 3.5 generated outfit combination recommendations given the input body type and personal style prompt and the target / context event.
Our dataset contains a variety of events such as business functions, cocktail parties, casual gatherings, fancy dates, etc. See an example Mistral-based finetuned model.
| [
"# Style Chatbot Dataset\nA style recommendation dataset that contains input (body type and personal clothing style), context (event context) and response triplets. The responses are GPT 3.5 generated outfit combination recommendations given the input body type and personal style prompt and the target / context event.\n\nOur dataset contains a variety of events such as business functions, cocktail parties, casual gatherings, fancy dates, etc. See an example Mistral-based finetuned model."
]
| [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-English #license-mit #region-us \n",
"# Style Chatbot Dataset\nA style recommendation dataset that contains input (body type and personal clothing style), context (event context) and response triplets. The responses are GPT 3.5 generated outfit combination recommendations given the input body type and personal style prompt and the target / context event.\n\nOur dataset contains a variety of events such as business functions, cocktail parties, casual gatherings, fancy dates, etc. See an example Mistral-based finetuned model."
]
| [
38,
103
]
| [
"passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-English #license-mit #region-us \n# Style Chatbot Dataset\nA style recommendation dataset that contains input (body type and personal clothing style), context (event context) and response triplets. The responses are GPT 3.5 generated outfit combination recommendations given the input body type and personal style prompt and the target / context event.\n\nOur dataset contains a variety of events such as business functions, cocktail parties, casual gatherings, fancy dates, etc. See an example Mistral-based finetuned model."
]
|
62bdec1b56d03760f1f9cdc0cffbdedae3e3bc9c |
# Dataset Card for Evaluation run of postbot/gpt2-medium-emailgen
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/postbot/gpt2-medium-emailgen
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [postbot/gpt2-medium-emailgen](https://huggingface.co/postbot/gpt2-medium-emailgen) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public",
"harness_winogrande_5",
split="train")
```
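
The aggregated metrics can be read back in the same way; a small sketch, assuming the `results` configuration and the `latest` split listed in this card's configuration metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public",
    "results",
    split="latest",
)
print(results[0])
```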
## Latest results
These are the [latest results from run 2023-11-19T16:44:21.952672](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public/blob/main/results_2023-11-19T16-44-21.952672.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24213502321663855,
"acc_stderr": 0.030210866111969045,
"acc_norm": 0.2431559232771965,
"acc_norm_stderr": 0.031011858860463776,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237269,
"mc2": 0.43956041135282153,
"mc2_stderr": 0.015361204238680572,
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464839703,
"f1": 0.02527684563758395,
"f1_stderr": 0.0009458090371986776
},
"harness|arc:challenge|25": {
"acc": 0.22184300341296928,
"acc_stderr": 0.012141659068147882,
"acc_norm": 0.2645051194539249,
"acc_norm_stderr": 0.012889272949313364
},
"harness|hellaswag|10": {
"acc": 0.30541724756024696,
"acc_stderr": 0.00459642622000091,
"acc_norm": 0.3430591515634336,
"acc_norm_stderr": 0.004737608340163401
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756191,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756191
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.022019080012217897,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.022019080012217897
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124251,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124251
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1625615763546798,
"acc_stderr": 0.02596030006460558,
"acc_norm": 0.1625615763546798,
"acc_norm_stderr": 0.02596030006460558
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.02869787397186068,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.02869787397186068
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463206,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463206
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871948,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871948
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882392,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882392
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25688073394495414,
"acc_stderr": 0.018732492928342462,
"acc_norm": 0.25688073394495414,
"acc_norm_stderr": 0.018732492928342462
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.02730348459906942,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.02730348459906942
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2556053811659193,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.2556053811659193,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976235,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976235
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1794871794871795,
"acc_stderr": 0.025140935950335442,
"acc_norm": 0.1794871794871795,
"acc_norm_stderr": 0.025140935950335442
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.01516202415227844,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.01516202415227844
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21221864951768488,
"acc_stderr": 0.023222756797435105,
"acc_norm": 0.21221864951768488,
"acc_norm_stderr": 0.023222756797435105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20679012345679013,
"acc_stderr": 0.022535006705942825,
"acc_norm": 0.20679012345679013,
"acc_norm_stderr": 0.022535006705942825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24315514993481094,
"acc_stderr": 0.010956556654417353,
"acc_norm": 0.24315514993481094,
"acc_norm_stderr": 0.010956556654417353
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877743,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.025607375986579153,
"acc_norm": 0.2,
"acc_norm_stderr": 0.025607375986579153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.03106939026078942,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.03106939026078942
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237269,
"mc2": 0.43956041135282153,
"mc2_stderr": 0.015361204238680572
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.0140519560640769
},
"harness|drop|3": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464839703,
"f1": 0.02527684563758395,
"f1_stderr": 0.0009458090371986776
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_postbot__gpt2-medium-emailgen | [
"region:us"
]
| 2023-11-19T16:46:06+00:00 | {"pretty_name": "Evaluation run of postbot/gpt2-medium-emailgen", "dataset_summary": "Dataset automatically created during the evaluation run of model [postbot/gpt2-medium-emailgen](https://huggingface.co/postbot/gpt2-medium-emailgen) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-19T16:44:21.952672](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public/blob/main/results_2023-11-19T16-44-21.952672.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24213502321663855,\n \"acc_stderr\": 0.030210866111969045,\n \"acc_norm\": 0.2431559232771965,\n \"acc_norm_stderr\": 0.031011858860463776,\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237269,\n \"mc2\": 0.43956041135282153,\n \"mc2_stderr\": 0.015361204238680572,\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464839703,\n \"f1\": 0.02527684563758395,\n \"f1_stderr\": 0.0009458090371986776\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22184300341296928,\n \"acc_stderr\": 0.012141659068147882,\n \"acc_norm\": 0.2645051194539249,\n \"acc_norm_stderr\": 0.012889272949313364\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.30541724756024696,\n \"acc_stderr\": 0.00459642622000091,\n \"acc_norm\": 0.3430591515634336,\n \"acc_norm_stderr\": 0.004737608340163401\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756191,\n \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756191\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n 
\"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217897,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217897\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.03619604524124251,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.03619604524124251\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1625615763546798,\n \"acc_stderr\": 0.02596030006460558,\n \"acc_norm\": 0.1625615763546798,\n \"acc_norm_stderr\": 0.02596030006460558\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.19689119170984457,\n \"acc_stderr\": 0.02869787397186068,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.02869787397186068\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463206,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463206\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871948,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871948\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882392,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882392\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25688073394495414,\n \"acc_stderr\": 0.018732492928342462,\n \"acc_norm\": 0.25688073394495414,\n \"acc_norm_stderr\": 0.018732492928342462\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.22784810126582278,\n \"acc_stderr\": 0.02730348459906942,\n \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.02730348459906942\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2556053811659193,\n \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.2556053811659193,\n \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.042466243366976235,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.042466243366976235\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1794871794871795,\n \"acc_stderr\": 0.025140935950335442,\n \"acc_norm\": 0.1794871794871795,\n \"acc_norm_stderr\": 0.025140935950335442\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.23,\n 
\"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n \"acc_stderr\": 0.01516202415227844,\n \"acc_norm\": 0.23499361430395913,\n \"acc_norm_stderr\": 0.01516202415227844\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21221864951768488,\n \"acc_stderr\": 0.023222756797435105,\n \"acc_norm\": 0.21221864951768488,\n \"acc_norm_stderr\": 0.023222756797435105\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.20679012345679013,\n \"acc_stderr\": 0.022535006705942825,\n \"acc_norm\": 0.20679012345679013,\n \"acc_norm_stderr\": 0.022535006705942825\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n \"acc_stderr\": 0.010956556654417353,\n \"acc_norm\": 0.24315514993481094,\n \"acc_norm_stderr\": 0.010956556654417353\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877743,\n \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877743\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913226,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913226\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.025607375986579153,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.025607375986579153\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n \"acc_stderr\": 0.03106939026078942,\n \"acc_norm\": 0.19879518072289157,\n \"acc_norm_stderr\": 0.03106939026078942\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237269,\n \"mc2\": 0.43956041135282153,\n \"mc2_stderr\": 0.015361204238680572\n },\n \"harness|winogrande|5\": {\n \"acc\": 
0.5043409629044988,\n \"acc_stderr\": 0.0140519560640769\n },\n \"harness|drop|3\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464839703,\n \"f1\": 0.02527684563758395,\n \"f1_stderr\": 0.0009458090371986776\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/postbot/gpt2-medium-emailgen", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|arc:challenge|25_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|drop|3_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|gsm8k|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hellaswag|10_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-44-21.952672.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-44-21.952672.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-19T16-44-21.952672.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-19T16-44-21.952672.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-44-21.952672.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["**/details_harness|winogrande|5_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-19T16-44-21.952672.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_19T16_44_21.952672", "path": ["results_2023-11-19T16-44-21.952672.parquet"]}, {"split": "latest", "path": ["results_2023-11-19T16-44-21.952672.parquet"]}]}]} | 2023-11-19T16:46:50+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of postbot/gpt2-medium-emailgen
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model postbot/gpt2-medium-emailgen on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
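A minimal sketch (the repository id below is an assumption based on the leaderboard's usual `details_<org>__<model>_public` naming convention; the configuration and the `latest` split are taken from this card's metadata):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard naming convention for postbot/gpt2-medium-emailgen;
# "harness_winogrande_5" and the "latest" split are declared in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public",
                    "harness_winogrande_5",
                    split="latest")
```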
## Latest results
These are the latest results from run 2023-11-19T16:44:21.952672 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of postbot/gpt2-medium-emailgen",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model postbot/gpt2-medium-emailgen on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T16:44:21.952672(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of postbot/gpt2-medium-emailgen",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model postbot/gpt2-medium-emailgen on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-19T16:44:21.952672(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
20,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of postbot/gpt2-medium-emailgen## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model postbot/gpt2-medium-emailgen on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-19T16:44:21.952672(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
61205c82b6f44ce26c9a5bb9ea95f94d8a4b0c77 |
### OpenAssistant Conversations Dataset (OASST1-TOP1) English
#### Dataset Summary
This is an OpenAssistant Conversations (OASST1) TOP1 dataset, in English only. The dataset contains 4253 conversations. It is a test dataset, aimed at reducing the fine-tuning time for English models used in a RAG system.
For further information and data structure, please refer to the OpenAssistant page:
https://huggingface.co/datasets/OpenAssistant/oasst1
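A minimal loading sketch (the repository id is taken from this card's metadata; the `train` split name is an assumption):

```python
from datasets import load_dataset

# English-only OASST1 TOP1 subset; repo id from this card, split name assumed.
ds = load_dataset("SchubergPhilis/OpenAssistant-Top1-ENG-V1", split="train")
print(len(ds))  # the card states 4253 conversations
print(ds[0])    # inspect a single conversation record
```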
| SchubergPhilis/OpenAssistant-Top1-ENG-V1 | [
"license:apache-2.0",
"region:us"
]
| 2023-11-19T16:49:43+00:00 | {"license": "apache-2.0"} | 2023-11-19T17:00:01+00:00 | []
| []
| TAGS
#license-apache-2.0 #region-us
|
### OpenAssistant Conversations Dataset (OASST1-TOP1) English
#### Dataset Summary
This is an OpenAssistant Conversations (OASST1) TOP1 dataset, in English only. The dataset contains 4253 conversations. It is a test dataset, aimed at reducing the fine-tuning time for English models used in a RAG system.
For further information and data structure, please refer to the OpenAssistant page:
URL
| [
"### OpenAssistant Conversations Dataset (OASST1-TOP1) English",
"#### Dataset Summary\nThis is an OpenAssistant Coversations dataset TOP1, only in English language only. The dataset contains 4253 conversations. It is a test dataset, aimed to reduce the fine tuning time for English models used in a RAG system.\n\nFor furthur information and data structure, please refer to the OpenAssistant page:\nURL"
]
| [
"TAGS\n#license-apache-2.0 #region-us \n",
"### OpenAssistant Conversations Dataset (OASST1-TOP1) English",
"#### Dataset Summary\nThis is an OpenAssistant Coversations dataset TOP1, only in English language only. The dataset contains 4253 conversations. It is a test dataset, aimed to reduce the fine tuning time for English models used in a RAG system.\n\nFor furthur information and data structure, please refer to the OpenAssistant page:\nURL"
]
| [
14,
19,
82
]
| [
"passage: TAGS\n#license-apache-2.0 #region-us \n### OpenAssistant Conversations Dataset (OASST1-TOP1) English#### Dataset Summary\nThis is an OpenAssistant Coversations dataset TOP1, only in English language only. The dataset contains 4253 conversations. It is a test dataset, aimed to reduce the fine tuning time for English models used in a RAG system.\n\nFor furthur information and data structure, please refer to the OpenAssistant page:\nURL"
]
|
9a7affa2ab9121ddf4a68a063f259ed66f74828e |
# Bangumi Image Base of Shikkaku Mon No Saikyou Kenja
This is the image base of bangumi Shikkaku Mon no Saikyou Kenja, in which we detected 35 characters and 2876 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 893 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 9 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 19 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 99 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 16 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 27 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 19 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 31 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 22 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 86 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 14 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 22 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 13 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 8 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 19 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 31 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 10 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 9 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 15 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 10 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 9 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 13 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 293 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 11 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 27 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 469 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 23 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 7 | [Download](27/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 28 | 6 | [Download](28/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 29 | 12 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 467 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 8 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 6 | [Download](32/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 33 | 6 | [Download](33/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 147 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
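The per-character archives listed above can also be fetched programmatically. A minimal sketch using `huggingface_hub` (the repo id comes from this card; the in-repo path `0/dataset.zip` is assumed to match the download links in the table):

```python
from zipfile import ZipFile
from huggingface_hub import hf_hub_download

# Download the archive for character 0 from the dataset repository.
path = hf_hub_download(
    repo_id="BangumiBase/shikkakumonnosaikyoukenja",
    filename="0/dataset.zip",
    repo_type="dataset",
)
with ZipFile(path) as zf:
    zf.extractall("character_0")  # manual cleaning of the ~1% noisy samples is still advised
```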
| BangumiBase/shikkakumonnosaikyoukenja | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
]
| 2023-11-19T17:00:50+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]} | 2023-11-19T18:47:53+00:00 | []
| []
| TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
| Bangumi Image Base of Shikkaku Mon No Saikyou Kenja
===================================================
This is the image base of bangumi Shikkaku Mon no Saikyou Kenja, in which we detected 35 characters and 2876 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| []
| [
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
]
| [
25
]
| [
"passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
]
|
dcc3b4ec2ac913bbc4282ea9d0be1ef7be32ab32 | # Dataset Card for "lsc_multiplechoice_top2vec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tomashs/lsc_multiplechoice_top2vec | [
"region:us"
]
| 2023-11-19T17:06:14+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "short_form", "dtype": "string"}, {"name": "long_form", "dtype": "string"}, {"name": "freq", "dtype": "int64"}, {"name": "num_candidates", "dtype": "int64"}, {"name": "__index_level_0__", "dtype": "int64"}, {"name": "topic_vector", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 150188148, "num_examples": 110752}, {"name": "val", "num_bytes": 34578554, "num_examples": 25932}, {"name": "test", "num_bytes": 34161105, "num_examples": 25175}], "download_size": 190641646, "dataset_size": 218927807}} | 2023-11-19T17:12:35+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "lsc_multiplechoice_top2vec"
More Information needed | [
"# Dataset Card for \"lsc_multiplechoice_top2vec\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"lsc_multiplechoice_top2vec\"\n\nMore Information needed"
]
| [
6,
21
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"lsc_multiplechoice_top2vec\"\n\nMore Information needed"
]
|
df9e171ff12d2c4618721e9d84645e762b450f3a | # Dataset Card for "vocalSoundRecognition_vocalSound"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | DynamicSuperb/vocalSoundRecognition_vocalSound | [
"region:us"
]
| 2023-11-19T17:08:30+00:00 | {"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "file", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1427179683.235, "num_examples": 3591}], "download_size": 1107141703, "dataset_size": 1427179683.235}} | 2023-11-19T17:12:35+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "vocalSoundRecognition_vocalSound"
More Information needed | [
"# Dataset Card for \"vocalSoundRecognition_vocalSound\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"vocalSoundRecognition_vocalSound\"\n\nMore Information needed"
]
| [
6,
22
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"vocalSoundRecognition_vocalSound\"\n\nMore Information needed"
]
|
f1c5815bc1b850ccb5b4aab9425249de090b2bbd | # Dataset Card for "taiga_stripped_rest"
This is a subset of the Taiga corpus (https://tatianashavrina.github.io/taiga_site), derived from all the sources except
[stihi](https://huggingface.co/datasets/cointegrated/taiga_stripped_stihi) and [proza](https://huggingface.co/datasets/cointegrated/taiga_stripped_proza):
`Arzamas`, `Interfax`, `Lenta`, `Magazines`, `NPlus1`, `KP`, `Fontanka`, `Subtitles` and `social`.
The dataset consists of plain texts, without morphological and syntactic annotation or metainformation.
For the `Subtitles` subset, we dropped all non-Russian texts.
For the `social` subset, we split the texts into individual database items,
or (for LiveJournal) into "posts" (defined as lines with 1000+ characters) and subsequent "comments".
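As an illustration only, the post/comment heuristic described above can be sketched as follows (the exact rules used to build the corpus may differ):

```python
def split_livejournal(lines):
    """Group raw lines into (post, comments) pairs: a line with 1000+ characters
    starts a new post, and the shorter lines that follow it are treated as comments."""
    items = []
    post, comments = None, []
    for line in lines:
        if len(line) >= 1000:       # long line -> new post
            if post is not None:
                items.append((post, comments))
            post, comments = line, []
        elif post is not None:      # short line -> comment on the current post
            comments.append(line)
    if post is not None:
        items.append((post, comments))
    return items
```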
For more details and analysis, and for the texts with annotation or metadata, please refer to website of the corpus.
Other subsets of Taiga: [proza](https://huggingface.co/datasets/cointegrated/taiga_stripped_proza) (fiction)
and [stihi](https://huggingface.co/datasets/cointegrated/taiga_stripped_stihi) (poetry).
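A minimal loading sketch for this subset (the repo id and split names, e.g. `Fontanka`, come from this card's metadata):

```python
from datasets import load_dataset

# Each source is exposed as its own split in this repository (per the card's metadata).
fontanka = load_dataset("cointegrated/taiga_stripped_rest", split="Fontanka")
print(fontanka[0]["text"][:200])  # "text" and "file" are the declared features
```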
License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/). | cointegrated/taiga_stripped_rest | [
"task_categories:text-generation",
"task_categories:fill-mask",
"size_categories:1M<n<10M",
"language:ru",
"license:cc-by-sa-3.0",
"taiga",
"tayga",
"region:us"
]
| 2023-11-19T17:56:03+00:00 | {"language": ["ru"], "license": "cc-by-sa-3.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "fill-mask"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "file", "dtype": "string"}], "splits": [{"name": "Arzamas", "num_bytes": 4725465, "num_examples": 311}, {"name": "Interfax", "num_bytes": 82478694, "num_examples": 46000}, {"name": "Lenta", "num_bytes": 99984639, "num_examples": 36000}, {"name": "Magazines", "num_bytes": 2295653294, "num_examples": 39000}, {"name": "NPlus1", "num_bytes": 23506941, "num_examples": 7000}, {"name": "KP", "num_bytes": 65444392, "num_examples": 45000}, {"name": "Fontanka", "num_bytes": 840679591, "num_examples": 342683}, {"name": "Subtitles", "num_bytes": 311508573, "num_examples": 7903}, {"name": "social", "num_bytes": 600396164, "num_examples": 804356}], "download_size": 2180717682, "dataset_size": 4324377753}, "tags": ["taiga", "tayga"]} | 2023-11-23T09:48:58+00:00 | []
| [
"ru"
]
| TAGS
#task_categories-text-generation #task_categories-fill-mask #size_categories-1M<n<10M #language-Russian #license-cc-by-sa-3.0 #taiga #tayga #region-us
| # Dataset Card for "taiga_stripped_rest"
This is a subset of the Taiga corpus (URL), derived from all the sources except
stihi and proza:
'Arzamas', 'Interfax', 'Lenta', 'Magazines', 'NPlus1', 'KP', 'Fontanka', 'Subtitles' and 'social'.
The dataset consists of plain texts, without morphological and syntactic annotation or metainformation.
For the 'Subtitles' subset, we dropped all non-Russian texts.
For the 'social' subset, we split the texts into individual database items,
or (for LiveJournal) into "posts" (defined as lines with 1000+ characters) and subsequent "comments".
For more details and analysis, and for the texts with annotation or metadata, please refer to website of the corpus.
Other subsets of Taiga: proza (fiction)
and stihi (poetry).
License: CC BY-SA 3.0. | [
"# Dataset Card for \"taiga_stripped_rest\"\n\nThis is a subset of the Taiga corpus (URL derived from the all the sources, except \nstihi and proza:\n'Arzamas', 'Interfax', 'Lenta', 'Magazines', 'NPlus1', 'KP', 'Fontanka', 'Subtitles' and 'social'. \n\nThe dataset consists of plain texts, without morphological and syntactic annotation or metainformation. \nFor the 'Subtitles' subset, we dropped all non-Russian texts. \nFor the 'social' subset, we split the texts into indidividual database items, \nor (for LiveJournal) into \"posts\" (defined as lines with 1000+ characters) and subsequent \"comments\".\n\nFor more details and analysis, and for the texts with annotation or metadata, please refer to website of the corpus.\n\nOther subsets of Taiga: proza (fiction) \nand stihi (poetry).\n\nLicense: CC BY-SA 3.0."
]
| [
"TAGS\n#task_categories-text-generation #task_categories-fill-mask #size_categories-1M<n<10M #language-Russian #license-cc-by-sa-3.0 #taiga #tayga #region-us \n",
"# Dataset Card for \"taiga_stripped_rest\"\n\nThis is a subset of the Taiga corpus (URL derived from the all the sources, except \nstihi and proza:\n'Arzamas', 'Interfax', 'Lenta', 'Magazines', 'NPlus1', 'KP', 'Fontanka', 'Subtitles' and 'social'. \n\nThe dataset consists of plain texts, without morphological and syntactic annotation or metainformation. \nFor the 'Subtitles' subset, we dropped all non-Russian texts. \nFor the 'social' subset, we split the texts into indidividual database items, \nor (for LiveJournal) into \"posts\" (defined as lines with 1000+ characters) and subsequent \"comments\".\n\nFor more details and analysis, and for the texts with annotation or metadata, please refer to website of the corpus.\n\nOther subsets of Taiga: proza (fiction) \nand stihi (poetry).\n\nLicense: CC BY-SA 3.0."
]
| [
62,
238
]
| [
"passage: TAGS\n#task_categories-text-generation #task_categories-fill-mask #size_categories-1M<n<10M #language-Russian #license-cc-by-sa-3.0 #taiga #tayga #region-us \n# Dataset Card for \"taiga_stripped_rest\"\n\nThis is a subset of the Taiga corpus (URL derived from the all the sources, except \nstihi and proza:\n'Arzamas', 'Interfax', 'Lenta', 'Magazines', 'NPlus1', 'KP', 'Fontanka', 'Subtitles' and 'social'. \n\nThe dataset consists of plain texts, without morphological and syntactic annotation or metainformation. \nFor the 'Subtitles' subset, we dropped all non-Russian texts. \nFor the 'social' subset, we split the texts into indidividual database items, \nor (for LiveJournal) into \"posts\" (defined as lines with 1000+ characters) and subsequent \"comments\".\n\nFor more details and analysis, and for the texts with annotation or metadata, please refer to website of the corpus.\n\nOther subsets of Taiga: proza (fiction) \nand stihi (poetry).\n\nLicense: CC BY-SA 3.0."
]
|
7ade3c361c875faa1236ee6c359e81159b410aa5 |
# ChatHaruhi
# Reviving Anime Character in Reality via Large Language Model
![Code License]()
![Data License]()
github repo: https://github.com/LC1332/Chat-Haruhi-Suzumiya
**Chat-Haruhi-Suzumiya** is a language model that imitates the tone, personality and storylines of characters like Haruhi Suzumiya.
<details>
<summary> The project was developed by Cheng Li, Ziang Leng, Chenxi Yan, Xiaoyang Feng, HaoSheng Wang, Junyi Shen, Hao Wang, Weishi Mi, Aria Fei, Song Yan, Linkang Zhan, Yaokai Jia, Pingyu Wu, and Haozhen Sun,etc. </summary>
This is an open source project and the members were recruited from open source communities like DataWhale.
Lulu Li( [Cheng Li@SenseTime](https://github.com/LC1332) )initiated the whole project and designed and implemented most of the features.
Ziang Leng( [Ziang Leng@SenseTime](https://blairleng.github.io) )designed and implemented the training, data generation and backend architecture for ChatHaruhi 1.0.
Chenxi Yan( [Chenxi Yan@Chengdu University of Information Technology](https://github.com/todochenxi) )implemented and maintained the backend for ChatHaruhi 1.0.
Junyi Shen( [Junyi Shen@Zhejiang University](https://github.com/J1shen) )implemented the training code and participated in generating the training dataset.
Hao Wang( [Hao Wang](https://github.com/wanghao07456) )collected script data for a TV series and participated in data augmentation.
Weishi Mi( [Weishi MI@Tsinghua University](https://github.com/hhhwmws0117) )participated in data augmentation.
Aria Fei( [Aria Fei@BJUT](https://ariafyy.github.io/) )implemented the ASR feature for the script tool and participated in the Openness-Aware Personality paper project.
Xiaoyang Feng( [Xiaoyang Feng@Nanjing Agricultural University](https://github.com/fengyunzaidushi) )integrated the script recognition tool and participated in the Openness-Aware Personality paper project.
Yue Leng ( [Song Yan](https://github.com/zealot52099) )Collected data from The Big Bang Theory. Implemented script format conversion.
scixing(HaoSheng Wang)( [HaoSheng Wang](https://github.com/ssccinng) ) implemented voiceprint recognition in the script tool and tts-vits speech synthesis.
Linkang Zhan( [JunityZhan@Case Western Reserve University](https://github.com/JunityZhan) ) collected Genshin Impact's system prompts and story data.
Yaokai Jia( [Yaokai Jia](https://github.com/KaiJiaBrother) )implemented the Vue frontend and practiced GPU extraction of Bert in a psychology project.
Pingyu Wu( [Pingyu Wu@Juncai Shuyun](https://github.com/wpydcr) )helped deploy the first version of the training code.
Haozhen Sun( [Haozhen Sun@Tianjin University] )plot the character figures for ChatHaruhi.
</details>
### Citation
Please cite the repo if you use the data or code in this repo.
```
@misc{li2023chatharuhi,
title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
year={2023},
eprint={2308.09597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | showchen/Kurisu | [
"task_categories:text-generation",
"task_categories:text2text-generation",
"size_categories:10K<n<100K",
"language:en",
"language:zh",
"license:cc-by-4.0",
"arxiv:2308.09597",
"region:us"
]
| 2023-11-19T18:36:41+00:00 | {"language": ["en", "zh"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "text2text-generation"]} | 2023-11-19T18:46:17+00:00 | [
"2308.09597"
]
| [
"en",
"zh"
]
| TAGS
#task_categories-text-generation #task_categories-text2text-generation #size_categories-10K<n<100K #language-English #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us
|
# ChatHaruhi
# Reviving Anime Character in Reality via Large Language Model
![Code License]()
![Data License]()
github repo: URL
Chat-Haruhi-Suzumiya is a language model that imitates the tone, personality and storylines of characters like Haruhi Suzumiya.
<details>
<summary> The project was developed by Cheng Li, Ziang Leng, Chenxi Yan, Xiaoyang Feng, HaoSheng Wang, Junyi Shen, Hao Wang, Weishi Mi, Aria Fei, Song Yan, Linkang Zhan, Yaokai Jia, Pingyu Wu, and Haozhen Sun,etc. </summary>
This is an open source project and the members were recruited from open source communities like DataWhale.
Lulu Li( Cheng Li@SenseTime )initiated the whole project and designed and implemented most of the features.
Ziang Leng( Ziang Leng@SenseTime )designed and implemented the training, data generation and backend architecture for ChatHaruhi 1.0.
Chenxi Yan( Chenxi Yan@Chengdu University of Information Technology )implemented and maintained the backend for ChatHaruhi 1.0.
Junyi Shen( Junyi Shen@Zhejiang University )implemented the training code and participated in generating the training dataset.
Hao Wang( Hao Wang )collected script data for a TV series and participated in data augmentation.
Weishi Mi( Weishi MI@Tsinghua University )participated in data augmentation.
Aria Fei( Aria Fei@BJUT )implemented the ASR feature for the script tool and participated in the Openness-Aware Personality paper project.
Xiaoyang Feng( Xiaoyang Feng@Nanjing Agricultural University )integrated the script recognition tool and participated in the Openness-Aware Personality paper project.
Yue Leng ( Song Yan )Collected data from The Big Bang Theory. Implemented script format conversion.
scixing(HaoSheng Wang)( HaoSheng Wang ) implemented voiceprint recognition in the script tool and tts-vits speech synthesis.
Linkang Zhan( JunityZhan@Case Western Reserve University ) collected Genshin Impact's system prompts and story data.
Yaokai Jia( Yaokai Jia )implemented the Vue frontend and practiced GPU extraction of Bert in a psychology project.
Pingyu Wu( Pingyu Wu@Juncai Shuyun )helped deploy the first version of the training code.
Haozhen Sun( [Haozhen Sun@Tianjin University] )plot the character figures for ChatHaruhi.
</details>
Please cite the repo if you use the data or code in this repo.
| [
"# ChatHaruhi",
"# Reviving Anime Character in Reality via Large Language Model\n\n![Code License]()\n![Data License]()\n\ngithub repo: URL\n\n\n\nChat-Haruhi-Suzumiyais a language model that imitates the tone, personality and storylines of characters like Haruhi Suzumiya,\n\n\n<details>\n <summary> The project was developed by Cheng Li, Ziang Leng, Chenxi Yan, Xiaoyang Feng, HaoSheng Wang, Junyi Shen, Hao Wang, Weishi Mi, Aria Fei, Song Yan, Linkang Zhan, Yaokai Jia, Pingyu Wu, and Haozhen Sun,etc. </summary>\n\nThis is an open source project and the members were recruited from open source communities like DataWhale.\n\nLulu Li( Cheng Li@SenseTime )initiated the whole project and designed and implemented most of the features.\n \nZiang Leng( Ziang Leng@SenseTime )designed and implemented the training, data generation and backend architecture for ChatHaruhi 1.0.\n\nChenxi Yan( Chenxi Yan@Chengdu University of Information Technology )implemented and maintained the backend for ChatHaruhi 1.0.\n\nJunyi Shen( Junyi Shen@Zhejiang University )implemented the training code and participated in generating the training dataset.\n\nHao Wang( Hao Wang )collected script data for a TV series and participated in data augmentation.\n\nWeishi Mi( Weishi MI@Tsinghua University )participated in data augmentation.\n \nAria Fei( Aria Fei@BJUT )implemented the ASR feature for the script tool and participated in the Openness-Aware Personality paper project.\n\nXiaoyang Feng( Xiaoyang Feng@Nanjing Agricultural University )integrated the script recognition tool and participated in the Openness-Aware Personality paper project.\n\nYue Leng ( Song Yan )Collected data from The Big Bang Theory. Implemented script format conversion.\n\nscixing(HaoSheng Wang)( HaoSheng Wang ) implemented voiceprint recognition in the script tool and tts-vits speech synthesis.\n\nLinkang Zhan( JunityZhan@Case Western Reserve University ) collected Genshin Impact's system prompts and story data.\n\nYaokai Jia( Yaokai Jia )implemented the Vue frontend and practiced GPU extraction of Bert in a psychology project.\n\nPingyu Wu( Pingyu Wu@Juncai Shuyun )helped deploy the first version of the training code. \n\nHaozhen Sun( [Haozhen Sun@Tianjin University] )plot the character figures for ChatHaruhi. \n\n\n\n</details>\n\nPlease cite the repo if you use the data or code in this repo."
]
| [
"TAGS\n#task_categories-text-generation #task_categories-text2text-generation #size_categories-10K<n<100K #language-English #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n",
"# ChatHaruhi",
"# Reviving Anime Character in Reality via Large Language Model\n\n![Code License]()\n![Data License]()\n\ngithub repo: URL\n\n\n\nChat-Haruhi-Suzumiyais a language model that imitates the tone, personality and storylines of characters like Haruhi Suzumiya,\n\n\n<details>\n <summary> The project was developed by Cheng Li, Ziang Leng, Chenxi Yan, Xiaoyang Feng, HaoSheng Wang, Junyi Shen, Hao Wang, Weishi Mi, Aria Fei, Song Yan, Linkang Zhan, Yaokai Jia, Pingyu Wu, and Haozhen Sun,etc. </summary>\n\nThis is an open source project and the members were recruited from open source communities like DataWhale.\n\nLulu Li( Cheng Li@SenseTime )initiated the whole project and designed and implemented most of the features.\n \nZiang Leng( Ziang Leng@SenseTime )designed and implemented the training, data generation and backend architecture for ChatHaruhi 1.0.\n\nChenxi Yan( Chenxi Yan@Chengdu University of Information Technology )implemented and maintained the backend for ChatHaruhi 1.0.\n\nJunyi Shen( Junyi Shen@Zhejiang University )implemented the training code and participated in generating the training dataset.\n\nHao Wang( Hao Wang )collected script data for a TV series and participated in data augmentation.\n\nWeishi Mi( Weishi MI@Tsinghua University )participated in data augmentation.\n \nAria Fei( Aria Fei@BJUT )implemented the ASR feature for the script tool and participated in the Openness-Aware Personality paper project.\n\nXiaoyang Feng( Xiaoyang Feng@Nanjing Agricultural University )integrated the script recognition tool and participated in the Openness-Aware Personality paper project.\n\nYue Leng ( Song Yan )Collected data from The Big Bang Theory. Implemented script format conversion.\n\nscixing(HaoSheng Wang)( HaoSheng Wang ) implemented voiceprint recognition in the script tool and tts-vits speech synthesis.\n\nLinkang Zhan( JunityZhan@Case Western Reserve University ) collected Genshin Impact's system prompts and story data.\n\nYaokai Jia( Yaokai Jia )implemented the Vue frontend and practiced GPU extraction of Bert in a psychology project.\n\nPingyu Wu( Pingyu Wu@Juncai Shuyun )helped deploy the first version of the training code. \n\nHaozhen Sun( [Haozhen Sun@Tianjin University] )plot the character figures for ChatHaruhi. \n\n\n\n</details>\n\nPlease cite the repo if you use the data or code in this repo."
]
| [
68,
5,
612
]
| [
"passage: TAGS\n#task_categories-text-generation #task_categories-text2text-generation #size_categories-10K<n<100K #language-English #language-Chinese #license-cc-by-4.0 #arxiv-2308.09597 #region-us \n# ChatHaruhi"
]
|
772b8a9ee636ea6f76ead9af465c0b4160e5ac9c |
# Bangumi Image Base of Rakudai Kishi No Cavalry
This is the image base of bangumi Rakudai Kishi no Cavalry, in which we detected 20 characters and 1314 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 305 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 19 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 365 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 41 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 62 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 44 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 47 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 23 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 105 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 9 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 18 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 20 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 9 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 10 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 9 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 9 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 37 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 6 | [Download](17/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 18 | 19 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 157 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
| BangumiBase/rakudaikishinocavalry | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
]
| 2023-11-19T18:55:29+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]} | 2023-11-19T20:05:25+00:00 | []
| []
| TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
| Bangumi Image Base of Rakudai Kishi No Cavalry
==============================================
This is the image base of bangumi Rakudai Kishi no Cavalry, in which we detected 20 characters and 1314 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| []
| [
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
]
| [
25
]
| [
"passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
]
|
57c999a1a9caf59eade350187fd5f0805df877d6 |
# Bangumi Image Base of Masou Gakuen Hxh
This is the image base of bangumi Masou Gakuen HxH, in which we detected 22 characters and 1642 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 183 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 62 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 55 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 160 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 21 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 80 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 488 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 32 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 12 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 5 | [Download](9/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 10 | 80 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 68 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 24 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 32 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 33 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 16 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 38 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 20 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 6 | [Download](18/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 19 | 67 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 7 | [Download](20/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 153 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
| BangumiBase/masougakuenhxh | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
]
| 2023-11-19T19:04:51+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]} | 2023-11-19T20:20:49+00:00 | []
| []
| TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
| Bangumi Image Base of Masou Gakuen Hxh
======================================
This is the image base of bangumi Masou Gakuen HxH, in which we detected 22 characters and 1642 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| []
| [
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
]
| [
25
]
| [
"passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
]
|
938c721bb2f1db01acc50287a6ce446f1ff3d1ad | # Dataset Card for "RickAndMorty-blip-captions-1024"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Norod78/RickAndMorty-blip-captions-1024 | [
"region:us"
]
| 2023-11-19T19:36:05+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 178602553.0, "num_examples": 188}], "download_size": 178603589, "dataset_size": 178602553.0}} | 2023-11-19T19:36:23+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "RickAndMorty-blip-captions-1024"
More Information needed | [
"# Dataset Card for \"RickAndMorty-blip-captions-1024\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"RickAndMorty-blip-captions-1024\"\n\nMore Information needed"
]
| [
6,
23
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"RickAndMorty-blip-captions-1024\"\n\nMore Information needed"
]
|
74683e973e2030a54d63be20fe1da95f1b31d431 | # Dataset Card for "ClaymationChristmas-blip-captions-1024"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Norod78/ClaymationChristmas-blip-captions-1024 | [
"region:us"
]
| 2023-11-19T19:41:51+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 317094669.0, "num_examples": 331}], "download_size": 315674642, "dataset_size": 317094669.0}} | 2023-11-19T19:42:22+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "ClaymationChristmas-blip-captions-1024"
More Information needed | [
"# Dataset Card for \"ClaymationChristmas-blip-captions-1024\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"ClaymationChristmas-blip-captions-1024\"\n\nMore Information needed"
]
| [
6,
24
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"ClaymationChristmas-blip-captions-1024\"\n\nMore Information needed"
]
|
c11bdf66e893e8359ef2fe5e95e76fba35e250b1 |
# Dataset Card for DALL·E 3 Reddit Images Dataset
**Description**: This dataset consists of high-quality synthetic images produced with DALL·E 3 that were shared on Reddit, and is meant to be captioned and combined with other datasets before use in training new models.
Currently this dataset contains 3465 images, and more images will be added periodically.
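As a minimal, hedged sketch (assuming the repository keeps the default configuration and single `image` feature listed in its metadata), the images can be loaded with the `datasets` library before being captioned:

```python
from datasets import load_dataset

# Load the image-only training split; each record exposes a PIL image
# under the "image" key.
ds = load_dataset("ProGamerGov/dalle-3-reddit-dataset", split="train")

sample = ds[0]["image"]            # PIL.Image.Image
print(ds.num_rows, sample.size)    # image count and (width, height) of the first image
```

From there, any captioning model of your choice can be run over the `image` column to produce the text side of an image-text pair.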
| ProGamerGov/dalle-3-reddit-dataset | [
"language:en",
"license:mit",
"image-text-dataset",
"synthetic-dataset",
"region:us"
]
| 2023-11-19T19:44:52+00:00 | {"language": ["en"], "license": ["mit"], "tags": ["image-text-dataset", "synthetic-dataset"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}]}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-11-20T17:11:49+00:00 | []
| [
"en"
]
| TAGS
#language-English #license-mit #image-text-dataset #synthetic-dataset #region-us
|
# Dataset Card for DALL·E 3 Reddit Images Dataset
Description: This dataset consists of high-quality synthetic images produced with DALL·E 3 that were shared on Reddit, and is meant to be captioned and combined with other datasets before use in training new models.
Currently this dataset contains 3465 images, and more images will be added periodically.
| [
"# Dataset Card for DALL·E 3 Reddit Images Dataset\n\nDescription: This dataset consists of high quality synthetic images produced with Dalle 3 that were shared on Reddit, and is meant to be captioned and combined with other datasets before use in training new models.\n\nCurrently this dataset contains 3465 images, and more images will be periodically added."
]
| [
"TAGS\n#language-English #license-mit #image-text-dataset #synthetic-dataset #region-us \n",
"# Dataset Card for DALL·E 3 Reddit Images Dataset\n\nDescription: This dataset consists of high quality synthetic images produced with Dalle 3 that were shared on Reddit, and is meant to be captioned and combined with other datasets before use in training new models.\n\nCurrently this dataset contains 3465 images, and more images will be periodically added."
]
| [
29,
81
]
| [
"passage: TAGS\n#language-English #license-mit #image-text-dataset #synthetic-dataset #region-us \n# Dataset Card for DALL·E 3 Reddit Images Dataset\n\nDescription: This dataset consists of high quality synthetic images produced with Dalle 3 that were shared on Reddit, and is meant to be captioned and combined with other datasets before use in training new models.\n\nCurrently this dataset contains 3465 images, and more images will be periodically added."
]
|
a602d48c80a2d16036e0f278849a23cc2d36d847 | # Dataset Card for "arxive_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | AlFrauch/arxive_dataset | [
"region:us"
]
| 2023-11-19T20:05:21+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 100081932758, "num_examples": 1903579}], "download_size": 16157271684, "dataset_size": 100081932758}} | 2023-11-25T12:23:18+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "arxive_dataset"
More Information needed | [
"# Dataset Card for \"arxive_dataset\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"arxive_dataset\"\n\nMore Information needed"
]
| [
6,
16
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"arxive_dataset\"\n\nMore Information needed"
]
|
449fea801c1024078074a4203a02c3d6ed58c537 | Credits: Taken from https://www.kaggle.com/datasets/dasmehdixtr/drone-dataset-uav | xuanzz/Drone | [
"license:mit",
"region:us"
]
| 2023-11-19T20:23:57+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 381103939.086, "num_examples": 1359}], "download_size": 379073342, "dataset_size": 381103939.086}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-11-19T21:08:14+00:00 | []
| []
| TAGS
#license-mit #region-us
| Credits: Taken from URL | []
| [
"TAGS\n#license-mit #region-us \n"
]
| [
11
]
| [
"passage: TAGS\n#license-mit #region-us \n"
]
|
53c40ec7cbd85267fa6eaef09d06d3e68d19c729 | # Dataset Card for "PPLM-PQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Yijia-Xiao/PII-PQA-raw | [
"region:us"
]
| 2023-11-19T20:28:04+00:00 | {"dataset_info": {"features": [{"name": "Question", "dtype": "string"}, {"name": "Answer", "dtype": "string"}, {"name": "Protected Answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7185732, "num_examples": 42499}, {"name": "test", "num_bytes": 1274128, "num_examples": 7504}], "download_size": 1212545, "dataset_size": 8459860}} | 2023-11-19T20:36:49+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "PPLM-PQA"
More Information needed | [
"# Dataset Card for \"PPLM-PQA\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"PPLM-PQA\"\n\nMore Information needed"
]
| [
6,
15
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"PPLM-PQA\"\n\nMore Information needed"
]
|
37d5516a2c0300e4a16dd7039a2d8ae41f81cb2e | # Dataset Card for "autotrain-data-tu9p-fvi7-zb2n"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | healthcorum/autotrain-data-tu9p-fvi7-zb2n | [
"region:us"
]
| 2023-11-19T20:48:34+00:00 | {"dataset_info": {"features": [{"name": "Unnamed: 0", "dtype": "int64"}, {"name": "responses", "dtype": "string"}, {"name": "autotrain_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36088167, "num_examples": 9998}, {"name": "validation", "num_bytes": 36088167, "num_examples": 9998}], "download_size": 12071286, "dataset_size": 72176334}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2023-11-19T20:48:35+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "autotrain-data-tu9p-fvi7-zb2n"
More Information needed | [
"# Dataset Card for \"autotrain-data-tu9p-fvi7-zb2n\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"autotrain-data-tu9p-fvi7-zb2n\"\n\nMore Information needed"
]
| [
6,
26
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"autotrain-data-tu9p-fvi7-zb2n\"\n\nMore Information needed"
]
|
c790131694b0c5ca476740b64a55c1dda1760bb4 | # Dataset Card for "PPLM-PQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Yijia-Xiao/PPLM-PQA | [
"region:us"
]
| 2023-11-19T20:53:54+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "cleaned_output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8673197, "num_examples": 42499}, {"name": "test", "num_bytes": 1536768, "num_examples": 7504}], "download_size": 1233735, "dataset_size": 10209965}} | 2023-11-19T20:53:59+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "PPLM-PQA"
More Information needed | [
"# Dataset Card for \"PPLM-PQA\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"PPLM-PQA\"\n\nMore Information needed"
]
| [
6,
15
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"PPLM-PQA\"\n\nMore Information needed"
]
|
ab1220e6823732093a1c8a0122af98f7da1f4217 |
# govgis_nov2023-slim-spatial
🤖 This README was written by [`HuggingFaceH4/zephyr-7b-beta`](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta). 🤖
Introducing the govgis_nov2023-slim-spatial dataset, a carefully curated and georeferenced subset of the extensive [govgis_nov2023](https://huggingface.co/datasets/joshuasundance/govgis_nov2023) collection. This dataset stands out for its focus on geospatial data analysis, enriched with vector embeddings. While we have only explored a portion of this vast collection, the variety and richness of the content encountered have been remarkable, making it challenging to fully capture the dataset's breadth in a brief overview.
## Overview
The govgis_nov2023-slim-spatial dataset condenses key elements from the larger govgis_nov2023 collection into a more manageable format. It offers a glimpse into an extensive range of geospatial data types, all augmented with vector embeddings using [`BAAI/bge-large-en-v1.5`](https://huggingface.co/BAAI/bge-large-en-v1.5). Our exploration has revealed a staggering variety in the data, suggesting vast potential applications.
Key Features:
- **Diverse Geospatial Data Types:** The dataset includes samples of data like ecological data, census data, administrative boundaries, transportation networks, and land use maps, representing just a fraction of what's available.
- **Advanced Vector Search Capabilities:** Augmented with vector embeddings using [`BAAI/bge-large-en-v1.5`](https://huggingface.co/BAAI/bge-large-en-v1.5) for sophisticated content discovery.
## Dataset Files
The dataset comprises two distinct files:
1. **`govgis_nov2023_slim_spatial.geoparquet`:** This file offers core georeferenced spatial data, suitable for a broad range of analysis needs.
2. **`govgis_nov2023_slim_spatial_embs.geoparquet`:** A more comprehensive file with detailed vector embeddings, catering to more in-depth analytical demands.
This two-tiered approach allows users to tailor their engagement with the dataset based on their specific requirements.
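As a rough, hedged sketch of how the embeddings file could be queried — the embedding column name, its normalisation, and the query handling are assumptions, since the exact schema is not reproduced here:

```python
import numpy as np
import geopandas as gpd
from sentence_transformers import SentenceTransformer

# Read the embeddings file; "embedding" is a placeholder column name --
# inspect the geoparquet schema and adjust before running.
gdf = gpd.read_parquet("govgis_nov2023_slim_spatial_embs.geoparquet")
vectors = np.vstack(gdf["embedding"].to_numpy())

# Embed a free-text query with the same model used to build the dataset.
model = SentenceTransformer("BAAI/bge-large-en-v1.5")
query = model.encode("wetland boundaries and land use", normalize_embeddings=True)

# Cosine similarity; this assumes the stored vectors are L2-normalised,
# otherwise normalise them first.
scores = vectors @ query
print(gdf.iloc[np.argsort(-scores)[:5]])
```

The lighter `govgis_nov2023_slim_spatial.geoparquet` file can be read the same way with `gpd.read_parquet` when only the spatial records, not the embeddings, are needed.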
## Benefits:
- **Selective Accessibility:** The dataset provides an accessible entry point to a seemingly endless variety of spatial data.
- **Efficient yet Comprehensive:** It distills a vast array of data into a more practical format without losing the essence of its diversity.
- **Untapped Application Potential:** The examples we provide are merely starting points; the dataset's true scope is far more extensive and varied.
- **Enhanced Analytical Depth:** Vector embeddings from [`BAAI/bge-large-en-v1.5`](https://huggingface.co/BAAI/bge-large-en-v1.5) offer advanced data analysis capabilities.
## Use Cases:
Given the sheer variety of data we've glimpsed, the dataset is poised to serve a myriad of applications, far beyond the few examples we can confidently cite. It's designed to be adaptable to diverse analytical pursuits across different fields.
# Conclusion:
The govgis_nov2023-slim-spatial dataset is a thoughtfully distilled, georeferenced, and vector-embedded version of its more extensive counterpart. Our limited exploration has revealed an astonishing variety of data, hinting at a much broader scope of potential applications than we can definitively describe. This dual-file dataset is crafted to meet a wide spectrum of spatial data analysis needs, from the straightforward to the highly specialized, accommodating the extensive possibilities that lie within the realm of geospatial data. | joshuasundance/govgis_nov2023-slim-spatial | [
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"gis",
"geospatial",
"doi:10.57967/hf/1369",
"region:us"
]
| 2023-11-19T20:53:59+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "pretty_name": "govgis_nov2023-slim-spatial", "tags": ["gis", "geospatial"]} | 2023-11-23T00:18:04+00:00 | []
| [
"en"
]
| TAGS
#size_categories-100K<n<1M #language-English #license-mit #gis #geospatial #doi-10.57967/hf/1369 #region-us
|
# govgis_nov2023-slim-spatial
This README was written by 'HuggingFaceH4/zephyr-7b-beta'.
Introducing the govgis_nov2023-slim-spatial dataset, a carefully curated and georeferenced subset of the extensive govgis_nov2023 collection. This dataset stands out for its focus on geospatial data analysis, enriched with vector embeddings. While we have only explored a portion of this vast collection, the variety and richness of the content encountered have been remarkable, making it challenging to fully capture the dataset's breadth in a brief overview.
## Overview
The govgis_nov2023-slim-spatial dataset condenses key elements from the larger govgis_nov2023 collection into a more manageable format. It offers a glimpse into an extensive range of geospatial data types, all augmented with vector embeddings using 'BAAI/bge-large-en-v1.5'. Our exploration has revealed a staggering variety in the data, suggesting vast potential applications.
Key Features:
- Diverse Geospatial Data Types: The dataset includes samples of data like ecological data, census data, administrative boundaries, transportation networks, and land use maps, representing just a fraction of what's available.
- Advanced Vector Search Capabilities: Augmented with vector embeddings using 'BAAI/bge-large-en-v1.5' for sophisticated content discovery.
## Dataset Files
The dataset comprises two distinct files:
1. 'govgis_nov2023_slim_spatial.geoparquet': This file offers core georeferenced spatial data, suitable for a broad range of analysis needs.
2. 'govgis_nov2023_slim_spatial_embs.geoparquet': A more comprehensive file with detailed vector embeddings, catering to more in-depth analytical demands.
This two-tiered approach allows users to tailor their engagement with the dataset based on their specific requirements.
## Benefits:
- Selective Accessibility: The dataset provides an accessible entry point to a seemingly endless variety of spatial data.
- Efficient yet Comprehensive: It distills a vast array of data into a more practical format without losing the essence of its diversity.
- Untapped Application Potential: The examples we provide are merely starting points; the dataset's true scope is far more extensive and varied.
- Enhanced Analytical Depth: Vector embeddings from 'BAAI/bge-large-en-v1.5' offer advanced data analysis capabilities.
## Use Cases:
Given the sheer variety of data we've glimpsed, the dataset is poised to serve a myriad of applications, far beyond the few examples we can confidently cite. It's designed to be adaptable to diverse analytical pursuits across different fields.
# Conclusion:
The govgis_nov2023-slim-spatial dataset is a thoughtfully distilled, georeferenced, and vector-embedded version of its more extensive counterpart. Our limited exploration has revealed an astonishing variety of data, hinting at a much broader scope of potential applications than we can definitively describe. This dual-file dataset is crafted to meet a wide spectrum of spatial data analysis needs, from the straightforward to the highly specialized, accommodating the extensive possibilities that lie within the realm of geospatial data. | [
"# govgis_nov2023-slim-spatial\n\n This README was written by 'HuggingFaceH4/zephyr-7b-beta'. \n\n\nIntroducing the govgis_nov2023-slim-spatial dataset, a carefully curated and georeferenced subset of the extensive govgis_nov2023 collection. This dataset stands out for its focus on geospatial data analysis, enriched with vector embeddings. While we have only explored a portion of this vast collection, the variety and richness of the content encountered have been remarkable, making it challenging to fully capture the dataset's breadth in a brief overview.",
"## Overview\n\nThe govgis_nov2023-slim-spatial dataset condenses key elements from the larger govgis_nov2023 collection into a more manageable format. It offers a glimpse into an extensive range of geospatial data types, all augmented with vector embeddings using 'BAAI/bge-large-en-v1.5'. Our exploration has revealed a staggering variety in the data, suggesting vast potential applications.\n\nKey Features:\n\n- Diverse Geospatial Data Types: The dataset includes samples of data like ecological data, census data, administrative boundaries, transportation networks, and land use maps, representing just a fraction of what's available.\n- Advanced Vector Search Capabilities: Augmented with vector embeddings using 'BAAI/bge-large-en-v1.5' for sophisticated content discovery.",
"## Dataset Files\n\nThe dataset comprises two distinct files:\n\n1. 'govgis_nov2023_slim_spatial.geoparquet' This file offers core georeferenced spatial data, suitable for a broad range of analysis needs.\n2. 'govgis_nov2023_slim_spatial_embs.geoparquet': A more comprehensive file with detailed vector embeddings, catering to more in-depth analytical demands.\n\nThis two-tiered approach allows users to tailor their engagement with the dataset based on their specific requirements.",
"## Benefits:\n\n- Selective Accessibility: The dataset provides an accessible entry point to a seemingly endless variety of spatial data.\n- Efficient yet Comprehensive: It distills a vast array of data into a more practical format without losing the essence of its diversity.\n- Untapped Application Potential: The examples we provide are merely starting points; the dataset's true scope is far more extensive and varied.\n- Enhanced Analytical Depth: Vector embeddings from 'BAAI/bge-large-en-v1.5' offer advanced data analysis capabilities.",
"## Use Cases:\n\nGiven the sheer variety of data we've glimpsed, the dataset is poised to serve a myriad of applications, far beyond the few examples we can confidently cite. It's designed to be adaptable to diverse analytical pursuits across different fields.",
"# Conclusion:\n\nThe govgis_nov2023-slim-spatial dataset is a thoughtfully distilled, georeferenced, and vector-embedded version of its more extensive counterpart. Our limited exploration has revealed an astonishing variety of data, hinting at a much broader scope of potential applications than we can definitively describe. This dual-file dataset is crafted to meet a wide spectrum of spatial data analysis needs, from the straightforward to the highly specialized, accommodating the extensive possibilities that lie within the realm of geospatial data."
]
| [
"TAGS\n#size_categories-100K<n<1M #language-English #license-mit #gis #geospatial #doi-10.57967/hf/1369 #region-us \n",
"# govgis_nov2023-slim-spatial\n\n This README was written by 'HuggingFaceH4/zephyr-7b-beta'. \n\n\nIntroducing the govgis_nov2023-slim-spatial dataset, a carefully curated and georeferenced subset of the extensive govgis_nov2023 collection. This dataset stands out for its focus on geospatial data analysis, enriched with vector embeddings. While we have only explored a portion of this vast collection, the variety and richness of the content encountered have been remarkable, making it challenging to fully capture the dataset's breadth in a brief overview.",
"## Overview\n\nThe govgis_nov2023-slim-spatial dataset condenses key elements from the larger govgis_nov2023 collection into a more manageable format. It offers a glimpse into an extensive range of geospatial data types, all augmented with vector embeddings using 'BAAI/bge-large-en-v1.5'. Our exploration has revealed a staggering variety in the data, suggesting vast potential applications.\n\nKey Features:\n\n- Diverse Geospatial Data Types: The dataset includes samples of data like ecological data, census data, administrative boundaries, transportation networks, and land use maps, representing just a fraction of what's available.\n- Advanced Vector Search Capabilities: Augmented with vector embeddings using 'BAAI/bge-large-en-v1.5' for sophisticated content discovery.",
"## Dataset Files\n\nThe dataset comprises two distinct files:\n\n1. 'govgis_nov2023_slim_spatial.geoparquet' This file offers core georeferenced spatial data, suitable for a broad range of analysis needs.\n2. 'govgis_nov2023_slim_spatial_embs.geoparquet': A more comprehensive file with detailed vector embeddings, catering to more in-depth analytical demands.\n\nThis two-tiered approach allows users to tailor their engagement with the dataset based on their specific requirements.",
"## Benefits:\n\n- Selective Accessibility: The dataset provides an accessible entry point to a seemingly endless variety of spatial data.\n- Efficient yet Comprehensive: It distills a vast array of data into a more practical format without losing the essence of its diversity.\n- Untapped Application Potential: The examples we provide are merely starting points; the dataset's true scope is far more extensive and varied.\n- Enhanced Analytical Depth: Vector embeddings from 'BAAI/bge-large-en-v1.5' offer advanced data analysis capabilities.",
"## Use Cases:\n\nGiven the sheer variety of data we've glimpsed, the dataset is poised to serve a myriad of applications, far beyond the few examples we can confidently cite. It's designed to be adaptable to diverse analytical pursuits across different fields.",
"# Conclusion:\n\nThe govgis_nov2023-slim-spatial dataset is a thoughtfully distilled, georeferenced, and vector-embedded version of its more extensive counterpart. Our limited exploration has revealed an astonishing variety of data, hinting at a much broader scope of potential applications than we can definitively describe. This dual-file dataset is crafted to meet a wide spectrum of spatial data analysis needs, from the straightforward to the highly specialized, accommodating the extensive possibilities that lie within the realm of geospatial data."
]
| [
46,
154,
215,
127,
144,
67,
136
]
| [
"passage: TAGS\n#size_categories-100K<n<1M #language-English #license-mit #gis #geospatial #doi-10.57967/hf/1369 #region-us \n# govgis_nov2023-slim-spatial\n\n This README was written by 'HuggingFaceH4/zephyr-7b-beta'. \n\n\nIntroducing the govgis_nov2023-slim-spatial dataset, a carefully curated and georeferenced subset of the extensive govgis_nov2023 collection. This dataset stands out for its focus on geospatial data analysis, enriched with vector embeddings. While we have only explored a portion of this vast collection, the variety and richness of the content encountered have been remarkable, making it challenging to fully capture the dataset's breadth in a brief overview.## Overview\n\nThe govgis_nov2023-slim-spatial dataset condenses key elements from the larger govgis_nov2023 collection into a more manageable format. It offers a glimpse into an extensive range of geospatial data types, all augmented with vector embeddings using 'BAAI/bge-large-en-v1.5'. Our exploration has revealed a staggering variety in the data, suggesting vast potential applications.\n\nKey Features:\n\n- Diverse Geospatial Data Types: The dataset includes samples of data like ecological data, census data, administrative boundaries, transportation networks, and land use maps, representing just a fraction of what's available.\n- Advanced Vector Search Capabilities: Augmented with vector embeddings using 'BAAI/bge-large-en-v1.5' for sophisticated content discovery."
]
|
c5287d2ed0050dd91ba3221c51826f0ef4fca894 |
# Bangumi Image Base of Sailor Moon (2010s)
This is the image base of bangumi Sailor Moon (2010s); we detected 46 characters and 3463 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 901 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 140 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 16 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 313 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 19 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 77 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 52 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 17 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 17 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 26 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 21 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 102 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 164 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 73 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 46 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 9 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 269 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 24 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 10 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 21 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 11 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 271 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 99 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 14 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 40 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 9 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 205 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 18 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 12 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 22 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 15 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 14 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 16 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 7 | [Download](33/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 34 | 9 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 23 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 26 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 15 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 8 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 5 | [Download](39/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 40 | 11 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 9 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 9 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 21 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 12 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 245 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
| BangumiBase/sailormoon2010s | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
]
| 2023-11-19T21:14:39+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]} | 2023-11-19T23:01:48+00:00 | []
| []
| TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
| Bangumi Image Base of Sailor Moon (2010s)
=========================================
This is the image base of bangumi Sailor Moon (2010s); we detected 46 characters and 3463 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| []
| [
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
]
| [
25
]
| [
"passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
]
|
a4550494e308abf5a3e5d6ca28e62b58d7b3495c |
# Bangumi Image Base of Sailor Moon (1990s)
This is the image base of bangumi Sailor Moon (1990s); we detected 132 characters and 14684 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:----------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|
| 0 | 3008 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 94 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 696 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 49 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 29 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 176 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 95 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 72 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 180 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 75 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 108 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 113 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 32 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 42 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 47 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 602 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 1066 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 395 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 208 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 79 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 86 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 62 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 50 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 53 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 76 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 141 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 67 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 45 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 750 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 103 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 34 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 42 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 20 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 67 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 79 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 40 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 45 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 118 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 41 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 62 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 93 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 79 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 920 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 55 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 75 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 36 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 15 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 126 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 41 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 46 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 100 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 121 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 36 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 102 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 50 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 105 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 47 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 60 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 26 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 47 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 79 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 74 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 11 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 73 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 30 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 32 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 102 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 17 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 49 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 24 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 28 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 38 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 96 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 52 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 747 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 50 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 43 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 21 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| 78 | 22 | [Download](78/dataset.zip) |  |  |  |  |  |  |  |  |
| 79 | 23 | [Download](79/dataset.zip) |  |  |  |  |  |  |  |  |
| 80 | 38 | [Download](80/dataset.zip) |  |  |  |  |  |  |  |  |
| 81 | 20 | [Download](81/dataset.zip) |  |  |  |  |  |  |  |  |
| 82 | 44 | [Download](82/dataset.zip) |  |  |  |  |  |  |  |  |
| 83 | 19 | [Download](83/dataset.zip) |  |  |  |  |  |  |  |  |
| 84 | 19 | [Download](84/dataset.zip) |  |  |  |  |  |  |  |  |
| 85 | 19 | [Download](85/dataset.zip) |  |  |  |  |  |  |  |  |
| 86 | 11 | [Download](86/dataset.zip) |  |  |  |  |  |  |  |  |
| 87 | 48 | [Download](87/dataset.zip) |  |  |  |  |  |  |  |  |
| 88 | 18 | [Download](88/dataset.zip) |  |  |  |  |  |  |  |  |
| 89 | 14 | [Download](89/dataset.zip) |  |  |  |  |  |  |  |  |
| 90 | 24 | [Download](90/dataset.zip) |  |  |  |  |  |  |  |  |
| 91 | 19 | [Download](91/dataset.zip) |  |  |  |  |  |  |  |  |
| 92 | 10 | [Download](92/dataset.zip) |  |  |  |  |  |  |  |  |
| 93 | 10 | [Download](93/dataset.zip) |  |  |  |  |  |  |  |  |
| 94 | 33 | [Download](94/dataset.zip) |  |  |  |  |  |  |  |  |
| 95 | 28 | [Download](95/dataset.zip) |  |  |  |  |  |  |  |  |
| 96 | 58 | [Download](96/dataset.zip) |  |  |  |  |  |  |  |  |
| 97 | 13 | [Download](97/dataset.zip) |  |  |  |  |  |  |  |  |
| 98 | 29 | [Download](98/dataset.zip) |  |  |  |  |  |  |  |  |
| 99 | 17 | [Download](99/dataset.zip) |  |  |  |  |  |  |  |  |
| 100 | 32 | [Download](100/dataset.zip) |  |  |  |  |  |  |  |  |
| 101 | 21 | [Download](101/dataset.zip) |  |  |  |  |  |  |  |  |
| 102 | 27 | [Download](102/dataset.zip) |  |  |  |  |  |  |  |  |
| 103 | 22 | [Download](103/dataset.zip) |  |  |  |  |  |  |  |  |
| 104 | 11 | [Download](104/dataset.zip) |  |  |  |  |  |  |  |  |
| 105 | 7 | [Download](105/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 106 | 12 | [Download](106/dataset.zip) |  |  |  |  |  |  |  |  |
| 107 | 14 | [Download](107/dataset.zip) |  |  |  |  |  |  |  |  |
| 108 | 22 | [Download](108/dataset.zip) |  |  |  |  |  |  |  |  |
| 109 | 21 | [Download](109/dataset.zip) |  |  |  |  |  |  |  |  |
| 110 | 25 | [Download](110/dataset.zip) |  |  |  |  |  |  |  |  |
| 111 | 45 | [Download](111/dataset.zip) |  |  |  |  |  |  |  |  |
| 112 | 11 | [Download](112/dataset.zip) |  |  |  |  |  |  |  |  |
| 113 | 23 | [Download](113/dataset.zip) |  |  |  |  |  |  |  |  |
| 114 | 14 | [Download](114/dataset.zip) |  |  |  |  |  |  |  |  |
| 115 | 39 | [Download](115/dataset.zip) |  |  |  |  |  |  |  |  |
| 116 | 17 | [Download](116/dataset.zip) |  |  |  |  |  |  |  |  |
| 117 | 27 | [Download](117/dataset.zip) |  |  |  |  |  |  |  |  |
| 118 | 56 | [Download](118/dataset.zip) |  |  |  |  |  |  |  |  |
| 119 | 19 | [Download](119/dataset.zip) |  |  |  |  |  |  |  |  |
| 120 | 17 | [Download](120/dataset.zip) |  |  |  |  |  |  |  |  |
| 121 | 14 | [Download](121/dataset.zip) |  |  |  |  |  |  |  |  |
| 122 | 12 | [Download](122/dataset.zip) |  |  |  |  |  |  |  |  |
| 123 | 103 | [Download](123/dataset.zip) |  |  |  |  |  |  |  |  |
| 124 | 39 | [Download](124/dataset.zip) |  |  |  |  |  |  |  |  |
| 125 | 15 | [Download](125/dataset.zip) |  |  |  |  |  |  |  |  |
| 126 | 19 | [Download](126/dataset.zip) |  |  |  |  |  |  |  |  |
| 127 | 11 | [Download](127/dataset.zip) |  |  |  |  |  |  |  |  |
| 128 | 15 | [Download](128/dataset.zip) |  |  |  |  |  |  |  |  |
| 129 | 8 | [Download](129/dataset.zip) |  |  |  |  |  |  |  |  |
| 130 | 9 | [Download](130/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 528 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
| BangumiBase/sailormoon1990s | [
"size_categories:10K<n<100K",
"license:mit",
"art",
"region:us"
]
| 2023-11-19T21:15:02+00:00 | {"license": "mit", "size_categories": ["10K<n<100K"], "tags": ["art"]} | 2023-11-20T11:23:44+00:00 | []
| []
| TAGS
#size_categories-10K<n<100K #license-mit #art #region-us
| Bangumi Image Base of Sailor Moon (1990s)
=========================================
This is the image base of bangumi Sailor Moon (1990s); we detected 132 characters and 14684 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| []
| [
"TAGS\n#size_categories-10K<n<100K #license-mit #art #region-us \n"
]
| [
25
]
| [
"passage: TAGS\n#size_categories-10K<n<100K #license-mit #art #region-us \n"
]
|
5f7a306c23fe8d0baf39ce8f7991e88c6203476a | # Dataset Card for "pii-PQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Yijia-Xiao/pii-PQA | [
"region:us"
]
| 2023-11-19T21:22:35+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "cleaned_output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10209349, "num_examples": 49995}], "download_size": 1229195, "dataset_size": 10209349}} | 2023-11-19T21:22:40+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "pii-PQA"
More Information needed | [
"# Dataset Card for \"pii-PQA\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"pii-PQA\"\n\nMore Information needed"
]
| [
6,
15
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"pii-PQA\"\n\nMore Information needed"
]
|
f534ad6f95b7780d45b97acced4d64e0d462ef3f |
# BEE-spoke-data/govdocs1-image
This contains `.jpg` files from govdocs1. Light deduplication was applied (i.e., `jdupes` on all files), which removed ~500 duplicate images.
```python
DatasetDict({
train: Dataset({
features: ['image'],
num_rows: 108895
})
})
```
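As a quick, hedged sketch (the configuration names below are those listed in the repository metadata), the images can be pulled with the `datasets` library:

```python
from datasets import load_dataset

# Default configuration: all ~109k images after jdupes deduplication.
ds = load_dataset("BEE-spoke-data/govdocs1-image", split="train")

# The repo metadata also lists perceptual-hash-deduplicated configs;
# pass the config name to load one of those smaller subsets instead.
ds_phash = load_dataset("BEE-spoke-data/govdocs1-image", "dedup-phash", split="train")
print(len(ds), len(ds_phash))
```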
## source
Source info/page: https://digitalcorpora.org/corpora/file-corpora/files/
```
@inproceedings{garfinkel2009bringing,
title={Bringing Science to Digital Forensics with Standardized Forensic Corpora},
author={Garfinkel, Simson and Farrell, Paul and Roussev, Vassil and Dinolt, George},
booktitle={Digital Forensic Research Workshop (DFRWS) 2009},
year={2009},
address={Montreal, Canada},
url={https://digitalcorpora.org/corpora/file-corpora/files/}
}
```
| BEE-spoke-data/govdocs1-image | [
"size_categories:100K<n<1M",
"license:odc-by",
"govdocs1",
"jpg",
"region:us"
]
| 2023-11-19T22:20:04+00:00 | {"license": "odc-by", "size_categories": ["100K<n<1M"], "dataset_info": [{"config_name": "dedup-phash", "features": [{"name": "image", "dtype": "image"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24292594648.068, "num_examples": 72334}], "download_size": 23672584689, "dataset_size": 24292594648.068}, {"config_name": "dedup-phash-10k", "features": [{"name": "image", "dtype": "image"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3292749886.0, "num_examples": 10000}], "download_size": 3278279266, "dataset_size": 3292749886.0}, {"config_name": "default", "features": [{"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 34698929817.535, "num_examples": 108895}], "download_size": 37151833690, "dataset_size": 34698929817.535}], "configs": [{"config_name": "dedup-phash", "data_files": [{"split": "train", "path": "dedup-phash/train-*"}]}, {"config_name": "dedup-phash-10k", "data_files": [{"split": "train", "path": "dedup-phash-10k/train-*"}]}, {"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["govdocs1", "jpg"]} | 2023-11-30T00:14:59+00:00 | []
| []
| TAGS
#size_categories-100K<n<1M #license-odc-by #govdocs1 #jpg #region-us
|
# BEE-spoke-data/govdocs1-image
This contains '.jpg' files from govdocs1. Light deduplication was applied (i.e., 'jdupes' on all files), which removed ~500 duplicate images.
## source
Source info/page: URL
| [
"# BEE-spoke-data/govdocs1-image\n\n\nThis contains '.jpg' files from govdocs1. Light deduplication was applied (i.e. 'jdupes' on all files) which removed ~500 duplicate images.",
"## source\n\nSource info/page: URL"
]
| [
"TAGS\n#size_categories-100K<n<1M #license-odc-by #govdocs1 #jpg #region-us \n",
"# BEE-spoke-data/govdocs1-image\n\n\nThis contains '.jpg' files from govdocs1. Light deduplication was applied (i.e. 'jdupes' on all files) which removed ~500 duplicate images.",
"## source\n\nSource info/page: URL"
]
| [
33,
56,
8
]
| [
"passage: TAGS\n#size_categories-100K<n<1M #license-odc-by #govdocs1 #jpg #region-us \n# BEE-spoke-data/govdocs1-image\n\n\nThis contains '.jpg' files from govdocs1. Light deduplication was applied (i.e. 'jdupes' on all files) which removed ~500 duplicate images.## source\n\nSource info/page: URL"
]
|
0aa637fde5d5000d17f8f15fec485571b09594c4 | # Dataset Card for "imdb_review_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 5cp/imdb_review_prompts | [
"region:us"
]
| 2023-11-19T22:44:29+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "neg", "1": "pos"}}}}], "splits": [{"name": "train", "num_bytes": 290977, "num_examples": 782}, {"name": "test", "num_bytes": 304788, "num_examples": 858}], "download_size": 286770, "dataset_size": 595765}} | 2023-11-19T22:44:31+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "imdb_review_prompts"
More Information needed | [
"# Dataset Card for \"imdb_review_prompts\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"imdb_review_prompts\"\n\nMore Information needed"
]
| [
6,
18
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"imdb_review_prompts\"\n\nMore Information needed"
]
|
5bf6882f7f10f3a2b36ed3dfdfe7c60dbf98a2c2 |
# ComfyOpenSubtitles
## Dataset Description
ComfyOpenSubtitles is a multilingual dataset that contains parallel translations of subtitles from various languages. It includes pairs of input and target languages, along with the corresponding subtitles.
### Languages
The dataset supports the following languages:
- English (en)
- Russian (ru)
- French (fr)
- Spanish (es)
- Arabic (ar)
- Simplified Chinese (zh-cn)
- Korean (ko)
- Japanese (ja)
- German (de)
## Dataset Structure
### Data Instances
Here are some examples of data instances:
- Input Language: English
Target Language: Russian
Input Text: "Oh, bud... what have you done?"
Output Text: "Эх, Кореш... Что ж вы наделали?"
- Input Language: Spanish
Target Language: French
Input Text: "This is a beautiful sunset."
Output Text: "C'est un magnifique coucher de soleil."
### Data Fields
The dataset includes the following fields for each instance:
- `input_language`: The language of the input text.
- `target_language`: The language of the target translation.
- `input_text`: The input text in the source language.
- `output_text`: The corresponding translation in the target language.
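
A minimal, hedged sketch of how these fields might be used to select a single language pair — the loading call and the exact language encoding (ISO codes vs. full names) are assumptions, so adjust them to the actual repository layout:

```python
from datasets import load_dataset

# Hypothetical loading call; the repository's file layout is not documented
# here, so the config/split arguments may need adjusting.
ds = load_dataset("ReDUB/ComfyOpenSubtitles", split="train")

# Keep only English -> Russian pairs using the documented fields.
en_ru = ds.filter(
    lambda row: row["input_language"] == "en" and row["target_language"] == "ru"
)
print(en_ru[0]["input_text"], "->", en_ru[0]["output_text"])
```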
### Data Splits
The dataset is typically divided into training splits with varying sizes.
## Dataset Creation
### Curation Rationale
The dataset was created to provide a multilingual collection of subtitles and their translations for research and natural language processing tasks.
### Source Data
The source data for this dataset consists of subtitles from various movies and TV shows.
### Personal and Sensitive Information
The dataset may contain text from movies and TV shows, which may include personal or sensitive information related to the content of those shows.
### Other Known Limitations
Some data may be inaccurate. Be careful.
## Acknowledgments
- https://huggingface.co/datasets/open_subtitles | ReDUB/ComfyOpenSubtitles | [
"task_categories:translation",
"size_categories:10M<n<100M",
"language:en",
"language:ru",
"language:fr",
"language:es",
"language:ar",
"language:zh",
"language:ko",
"language:ja",
"language:de",
"license:unknown",
"region:us"
]
| 2023-11-19T22:51:52+00:00 | {"language": ["en", "ru", "fr", "es", "ar", "zh", "ko", "ja", "de"], "license": "unknown", "size_categories": ["10M<n<100M"], "task_categories": ["translation"], "pretty_name": "ComfyOpenSubtitles"} | 2023-11-20T05:03:53+00:00 | []
| [
"en",
"ru",
"fr",
"es",
"ar",
"zh",
"ko",
"ja",
"de"
]
| TAGS
#task_categories-translation #size_categories-10M<n<100M #language-English #language-Russian #language-French #language-Spanish #language-Arabic #language-Chinese #language-Korean #language-Japanese #language-German #license-unknown #region-us
|
# ComfyOpenSubtitles
## Dataset Description
ComfyOpenSubtitles is a multilingual dataset that contains parallel translations of subtitles from various languages. It includes pairs of input and target languages, along with the corresponding subtitles.
### Languages
The dataset supports the following languages:
- English (en)
- Russian (ru)
- French (fr)
- Spanish (es)
- Arabic (ar)
- Simplified Chinese (zh-cn)
- Korean (ko)
- Japanese (ja)
- German (de)
## Dataset Structure
### Data Instances
Here are some examples of data instances:
- Input Language: English
Target Language: Russian
Input Text: "Oh, bud... what have you done?"
Output Text: "Эх, Кореш... Что ж вы наделали?"
- Input Language: Spanish
Target Language: French
Input Text: "This is a beautiful sunset."
Output Text: "C'est un magnifique coucher de soleil."
### Data Fields
The dataset includes the following fields for each instance:
- 'input_language': The language of the input text.
- 'target_language': The language of the target translation.
- 'input_text': The input text in the source language.
- 'output_text': The corresponding translation in the target language.
### Data Splits
The dataset is typically divided into training splits with varying sizes.
## Dataset Creation
### Curation Rationale
The dataset was created to provide a multilingual collection of subtitles and their translations for research and natural language processing tasks.
### Source Data
The source data for this dataset consists of subtitles from various movies and TV shows.
### Personal and Sensitive Information
The dataset may contain text from movies and TV shows, which may include personal or sensitive information related to the content of those shows.
### Other Known Limitations
Some data may be inaccurate. Be careful.
## Acknowledgments
- URL | [
"# ComfyOpenSubtitles",
"## Dataset Description\n\nComfyOpenSubtitles is a multilingual dataset that contains parallel translations of subtitles from various languages. It includes pairs of input and target languages, along with the corresponding subtitles.",
"### Languages\n\nThe dataset supports the following languages:\n- English (en)\n- Russian (ru)\n- French (fr)\n- Spanish (es)\n- Arabic (ar)\n- Simplified Chinese (zh-cn)\n- Korean (ko)\n- Japanese (ja)\n- German (de)",
"## Dataset Structure",
"### Data Instances\n\nHere are some examples of data instances:\n\n- Input Language: English\n Target Language: Russian\n Input Text: \"Oh, bud... what have you done?\"\n Output Text: \"Эх, Кореш... Что ж вы наделали?\"\n\n- Input Language: Spanish\n Target Language: French\n Input Text: \"This is a beautiful sunset.\"\n Output Text: \"C'est un magnifique coucher de soleil.\"",
"### Data Fields\n\nThe dataset includes the following fields for each instance:\n- 'input_language': The language of the input text.\n- 'target_language': The language of the target translation.\n- 'input_text': The input text in the source language.\n- 'output_text': The corresponding translation in the target language.",
"### Data Splits\n\nThe dataset is typically divided into training splits with varying sizes.",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset was created to provide a multilingual collection of subtitles and their translations for research and natural language processing tasks.",
"### Source Data\n\nThe source data for this dataset consists of subtitles from various movies and TV shows.",
"### Personal and Sensitive Information\n\nThe dataset may contain text from movies and TV shows, which may include personal or sensitive information related to the content of those shows.",
"### Other Known Limitations\n\nSome data may be inaccurate. Be careful.",
"## Acknowledgments\n\n- URL"
]
| [
"TAGS\n#task_categories-translation #size_categories-10M<n<100M #language-English #language-Russian #language-French #language-Spanish #language-Arabic #language-Chinese #language-Korean #language-Japanese #language-German #license-unknown #region-us \n",
"# ComfyOpenSubtitles",
"## Dataset Description\n\nComfyOpenSubtitles is a multilingual dataset that contains parallel translations of subtitles from various languages. It includes pairs of input and target languages, along with the corresponding subtitles.",
"### Languages\n\nThe dataset supports the following languages:\n- English (en)\n- Russian (ru)\n- French (fr)\n- Spanish (es)\n- Arabic (ar)\n- Simplified Chinese (zh-cn)\n- Korean (ko)\n- Japanese (ja)\n- German (de)",
"## Dataset Structure",
"### Data Instances\n\nHere are some examples of data instances:\n\n- Input Language: English\n Target Language: Russian\n Input Text: \"Oh, bud... what have you done?\"\n Output Text: \"Эх, Кореш... Что ж вы наделали?\"\n\n- Input Language: Spanish\n Target Language: French\n Input Text: \"This is a beautiful sunset.\"\n Output Text: \"C'est un magnifique coucher de soleil.\"",
"### Data Fields\n\nThe dataset includes the following fields for each instance:\n- 'input_language': The language of the input text.\n- 'target_language': The language of the target translation.\n- 'input_text': The input text in the source language.\n- 'output_text': The corresponding translation in the target language.",
"### Data Splits\n\nThe dataset is typically divided into training splits with varying sizes.",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset was created to provide a multilingual collection of subtitles and their translations for research and natural language processing tasks.",
"### Source Data\n\nThe source data for this dataset consists of subtitles from various movies and TV shows.",
"### Personal and Sensitive Information\n\nThe dataset may contain text from movies and TV shows, which may include personal or sensitive information related to the content of those shows.",
"### Other Known Limitations\n\nSome data may be inaccurate. Be careful.",
"## Acknowledgments\n\n- URL"
]
| [
79,
7,
52,
64,
6,
95,
80,
22,
5,
37,
24,
35,
20,
8
]
| [
"passage: TAGS\n#task_categories-translation #size_categories-10M<n<100M #language-English #language-Russian #language-French #language-Spanish #language-Arabic #language-Chinese #language-Korean #language-Japanese #language-German #license-unknown #region-us \n# ComfyOpenSubtitles## Dataset Description\n\nComfyOpenSubtitles is a multilingual dataset that contains parallel translations of subtitles from various languages. It includes pairs of input and target languages, along with the corresponding subtitles.### Languages\n\nThe dataset supports the following languages:\n- English (en)\n- Russian (ru)\n- French (fr)\n- Spanish (es)\n- Arabic (ar)\n- Simplified Chinese (zh-cn)\n- Korean (ko)\n- Japanese (ja)\n- German (de)## Dataset Structure### Data Instances\n\nHere are some examples of data instances:\n\n- Input Language: English\n Target Language: Russian\n Input Text: \"Oh, bud... what have you done?\"\n Output Text: \"Эх, Кореш... Что ж вы наделали?\"\n\n- Input Language: Spanish\n Target Language: French\n Input Text: \"This is a beautiful sunset.\"\n Output Text: \"C'est un magnifique coucher de soleil.\"### Data Fields\n\nThe dataset includes the following fields for each instance:\n- 'input_language': The language of the input text.\n- 'target_language': The language of the target translation.\n- 'input_text': The input text in the source language.\n- 'output_text': The corresponding translation in the target language.### Data Splits\n\nThe dataset is typically divided into training splits with varying sizes.## Dataset Creation### Curation Rationale\n\nThe dataset was created to provide a multilingual collection of subtitles and their translations for research and natural language processing tasks.### Source Data\n\nThe source data for this dataset consists of subtitles from various movies and TV shows.### Personal and Sensitive Information\n\nThe dataset may contain text from movies and TV shows, which may include personal or sensitive information related to the content of those shows."
]
|
cf3f61c2459fe9a5a86e5380fefe2d3dd5a925f6 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
Credit: https://www.kaggle.com/datasets/dasmehdixtr/drone-dataset-uav
This is a dataset from the link above. It is used for object-detection training with a YOLO model for the drone class.
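
As a rough, hedged sketch of that intended use — `drone.yaml` is a placeholder data config you would write yourself, pointing at this dataset's image/label folders and declaring the single `drone` class:

```python
from ultralytics import YOLO

# Start from a small pretrained checkpoint and fine-tune on the drone data.
model = YOLO("yolov8n.pt")
model.train(data="drone.yaml", epochs=50, imgsz=640)  # hypothetical config path

# Evaluate on the validation split defined in drone.yaml.
metrics = model.val()
print(metrics)
```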
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | ywanny/Drone_Detection | [
"region:us"
]
| 2023-11-19T23:02:07+00:00 | {} | 2023-11-19T23:32:13+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Dataset Name
Credit: URL
This is a dataset from the link above. It is used for object detection training on a YOLO model for the drone class.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nCredit: URL \nThis is a dataset from the above the link. It's used for object detection training on yolo model for the class of drone.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nCredit: URL \nThis is a dataset from the above the link. It's used for object detection training on yolo model for the class of drone.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
| [
6,
41,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name\n\n\n\nCredit: URL \nThis is a dataset from the above the link. It's used for object detection training on yolo model for the class of drone.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
]
|
153773b22327481554d0e8df309040f90b2423f8 | # Dataset Card for "xView2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | danielz01/xView2 | [
"region:us"
]
| 2023-11-19T23:37:30+00:00 | {"dataset_info": {"config_name": "competition", "features": [{"name": "image1", "dtype": "image"}, {"name": "image2", "dtype": "image"}, {"name": "mask1", "dtype": "image"}, {"name": "mask2", "dtype": "image"}, {"name": "objects1", "struct": [{"name": "bbox", "sequence": {"sequence": "int32"}}, {"name": "feature_type", "sequence": "string"}, {"name": "uid", "sequence": "string"}]}, {"name": "objects2", "struct": [{"name": "bbox", "sequence": {"sequence": "int32"}}, {"name": "feature_type", "sequence": "string"}, {"name": "subtype", "sequence": "string"}, {"name": "uid", "sequence": "string"}]}, {"name": "meta1", "struct": [{"name": "features", "struct": [{"name": "lng_lat", "list": [{"name": "properties", "struct": [{"name": "feature_type", "dtype": "string"}, {"name": "uid", "dtype": "string"}]}, {"name": "wkt", "dtype": "string"}]}, {"name": "xy", "list": [{"name": "properties", "struct": [{"name": "feature_type", "dtype": "string"}, {"name": "uid", "dtype": "string"}]}, {"name": "wkt", "dtype": "string"}]}]}, {"name": "metadata", "struct": [{"name": "capture_date", "dtype": "string"}, {"name": "catalog_id", "dtype": "string"}, {"name": "disaster", "dtype": "string"}, {"name": "disaster_type", "dtype": "string"}, {"name": "gsd", "dtype": "float64"}, {"name": "height", "dtype": "int64"}, {"name": "id", "dtype": "string"}, {"name": "img_name", "dtype": "string"}, {"name": "off_nadir_angle", "dtype": "float64"}, {"name": "original_height", "dtype": "int64"}, {"name": "original_width", "dtype": "int64"}, {"name": "pan_resolution", "dtype": "float64"}, {"name": "provider_asset_type", "dtype": "string"}, {"name": "sensor", "dtype": "string"}, {"name": "sun_azimuth", "dtype": "float64"}, {"name": "sun_elevation", "dtype": "float64"}, {"name": "target_azimuth", "dtype": "float64"}, {"name": "width", "dtype": "int64"}]}]}, {"name": "meta2", "struct": [{"name": "features", "struct": [{"name": "lng_lat", "list": [{"name": "properties", "struct": [{"name": "feature_type", "dtype": "string"}, {"name": "subtype", "dtype": "string"}, {"name": "uid", "dtype": "string"}]}, {"name": "wkt", "dtype": "string"}]}, {"name": "xy", "list": [{"name": "properties", "struct": [{"name": "feature_type", "dtype": "string"}, {"name": "subtype", "dtype": "string"}, {"name": "uid", "dtype": "string"}]}, {"name": "wkt", "dtype": "string"}]}]}, {"name": "metadata", "struct": [{"name": "capture_date", "dtype": "string"}, {"name": "catalog_id", "dtype": "string"}, {"name": "disaster", "dtype": "string"}, {"name": "disaster_type", "dtype": "string"}, {"name": "gsd", "dtype": "float64"}, {"name": "height", "dtype": "int64"}, {"name": "id", "dtype": "string"}, {"name": "img_name", "dtype": "string"}, {"name": "off_nadir_angle", "dtype": "float64"}, {"name": "original_height", "dtype": "int64"}, {"name": "original_width", "dtype": "int64"}, {"name": "pan_resolution", "dtype": "float64"}, {"name": "provider_asset_type", "dtype": "string"}, {"name": "sensor", "dtype": "string"}, {"name": "sun_azimuth", "dtype": "float64"}, {"name": "sun_elevation", "dtype": "float64"}, {"name": "target_azimuth", "dtype": "float64"}, {"name": "width", "dtype": "int64"}]}]}], "splits": [{"name": "train", "num_bytes": 8588187300.178, "num_examples": 2799}, {"name": "test", "num_bytes": 2860401182.0, "num_examples": 933}], "download_size": 11309747563, "dataset_size": 11448588482.178001}, "configs": [{"config_name": "competition", "data_files": [{"split": "train", "path": "competition/train-*"}, {"split": "test", "path": 
"competition/test-*"}]}]} | 2023-11-19T23:43:11+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "xView2"
More Information needed | [
"# Dataset Card for \"xView2\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"xView2\"\n\nMore Information needed"
]
| [
6,
13
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"xView2\"\n\nMore Information needed"
]
|
f9a221f361ea53756e2224470f5e1f9cd62f7d30 |
# U.S. Congressional Hearings Dataset
This dataset currently contains cleaned sentences from all House Committee on Energy and Commerce hearings from 2002.
A total of 1K+ hearing transcripts in txt formats from govinfo.gov were collected and cleaned. | erikliu18/us-congress-hearing | [
"task_categories:text-classification",
"language:en",
"finance",
"legal",
"region:us"
]
| 2023-11-20T00:49:47+00:00 | {"language": ["en"], "task_categories": ["text-classification"], "tags": ["finance", "legal"]} | 2023-11-20T01:01:16+00:00 | []
| [
"en"
]
| TAGS
#task_categories-text-classification #language-English #finance #legal #region-us
|
# U.S. Congressional Hearings Dataset
This dataset currently contains cleaned sentences from all House Committee on Energy and Commerce hearings from 2002.
A total of 1K+ hearing transcripts in txt formats from URL were collected and cleaned. | [
"# U.S. Congressional Hearings Dataset\n\nThis dataset currently contains cleaned sentences from all House Committee on Energy and Commerce hearings from 2002. \nA total of 1K+ hearing transcripts in txt formats from URL were collected and cleaned."
]
| [
"TAGS\n#task_categories-text-classification #language-English #finance #legal #region-us \n",
"# U.S. Congressional Hearings Dataset\n\nThis dataset currently contains cleaned sentences from all House Committee on Energy and Commerce hearings from 2002. \nA total of 1K+ hearing transcripts in txt formats from URL were collected and cleaned."
]
| [
26,
58
]
| [
"passage: TAGS\n#task_categories-text-classification #language-English #finance #legal #region-us \n# U.S. Congressional Hearings Dataset\n\nThis dataset currently contains cleaned sentences from all House Committee on Energy and Commerce hearings from 2002. \nA total of 1K+ hearing transcripts in txt formats from URL were collected and cleaned."
]
|
e739f802aacff689535281c0da8bf523a249f3a7 |
# Dataset card
The Luau dataset is a collection of code fragments collected from the Roblox Luau Data Sharing program.
Only experiences where creators gave us permission to contribute to the public Luau Dataset were used for producing this dataset.
# Languages:
Lua, Luau
# License:
MIT
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
# Format:
The dataset format is in jsonl format, with prompt / completion fields.
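A minimal sketch of reading the prompt / completion pairs, assuming standard JSONL with exactly those two field names (the file name below is only a placeholder):

```python
import json

def load_pairs(path="luau_corpus.jsonl"):
    """Read (prompt, completion) pairs from a JSONL file; the path is hypothetical."""
    pairs = []
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            record = json.loads(line)
            # Each record is expected to expose "prompt" and "completion" fields.
            pairs.append((record["prompt"], record["completion"]))
    return pairs
```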
# Dataset usage:
This dataset is designed for fine tuning large language models.
# Risks:
The dataset has been filtered for various quality signals, though Roblox makes no guarantees of data quality.
# Evaluation:
We have found that fine tuning a generalist code LLM typically improves its performance on Roblox Lua code quality by 10 to 20%.
| Roblox/luau_corpus | [
"license:mit",
"code",
"region:us"
]
| 2023-11-20T01:08:21+00:00 | {"license": "mit", "tags": ["code"]} | 2023-11-20T01:09:48+00:00 | []
| []
| TAGS
#license-mit #code #region-us
|
# Dataset card
The Luau dataset is a collection of code fragments collected from the Roblox Luau Data Sharing program.
Only experiences where creators gave us permission to contribute to the public Luau Dataset were used for producing this dataset.
# Languages:
Lua, Luau
# License:
MIT
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
# Format:
The dataset format is in jsonl format, with prompt / completion fields.
# Dataset usage:
This dataset is designed for fine tuning large language models.
# Risks:
The dataset has been filtered for various quality signals, though Roblox makes no guarantees of data quality.
# Evaluation:
We have found that fine tuning a generalist code LLM typically improves its performance on Roblox Lua code quality by 10 to 20%.
| [
"# Dataset card\n\n\nThe Luau dataset is a collection of code fragments collected from the Roblox Luau Data Sharing program.\n\nOnly experiences where creators gave us permission to contribute to the public Luau Dataset were used for producing this dataset.",
"# Languages:\nLua, Luau",
"# License:\nMIT\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.",
"# Format:\nThe dataset format is in jsonl format, with prompt / completion fields.",
"# Dataset usage:\nThis dataset is designed for fine tuning large language models.",
"# Risks:\n\nThe dataset has been filtered for various quality signals, though Roblox makes no guarantees of data quality.",
"# Evaluation:\n\nWe have found that typically fine tuning a generalist code LLM improve it’s performance on Roblox Lua code quality by 10 to 20%."
]
| [
"TAGS\n#license-mit #code #region-us \n",
"# Dataset card\n\n\nThe Luau dataset is a collection of code fragments collected from the Roblox Luau Data Sharing program.\n\nOnly experiences where creators gave us permission to contribute to the public Luau Dataset were used for producing this dataset.",
"# Languages:\nLua, Luau",
"# License:\nMIT\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.",
"# Format:\nThe dataset format is in jsonl format, with prompt / completion fields.",
"# Dataset usage:\nThis dataset is designed for fine tuning large language models.",
"# Risks:\n\nThe dataset has been filtered for various quality signals, though Roblox makes no guarantees of data quality.",
"# Evaluation:\n\nWe have found that typically fine tuning a generalist code LLM improve it’s performance on Roblox Lua code quality by 10 to 20%."
]
| [
13,
56,
9,
292,
22,
18,
29,
36
]
| [
"passage: TAGS\n#license-mit #code #region-us \n# Dataset card\n\n\nThe Luau dataset is a collection of code fragments collected from the Roblox Luau Data Sharing program.\n\nOnly experiences where creators gave us permission to contribute to the public Luau Dataset were used for producing this dataset.# Languages:\nLua, Luau# License:\nMIT\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.# Format:\nThe dataset format is in jsonl format, with prompt / completion fields.# Dataset usage:\nThis dataset is designed for fine tuning large language models.# Risks:\n\nThe dataset has been filtered for various quality signals, though Roblox makes no guarantees of data quality.# Evaluation:\n\nWe have found that typically fine tuning a generalist code LLM improve it’s performance on Roblox Lua code quality by 10 to 20%."
]
|
e199ebeb85fe83388786f5dfa6490abe67491922 | The drone dataset used here is from https://www.kaggle.com/datasets/muki2003/yolo-drone-detection-dataset | jameslpineda/cs370-uav-detection | [
"region:us"
]
| 2023-11-20T02:16:15+00:00 | {} | 2023-11-20T02:29:41+00:00 | []
| []
| TAGS
#region-us
| The drone dataset used here is from URL | []
| [
"TAGS\n#region-us \n"
]
| [
6
]
| [
"passage: TAGS\n#region-us \n"
]
|
c46f93b5ec79650637363b2757b4aaf00435c5b9 | 11-20 11:25
DeepL translation completed up to ID 1300. This is an unrefined dataset with no post-processing applied. Translation is still being added; once it is complete, the data will be cleaned and the dataset name will be changed.
original:
This dataset is the result of combing through several reverse proxy log sets and cleaning them of refusals, duplicate, incomplete, and poor quality responses. Lots of manual quality checks. There are also things like ecommerce descriptions for sex toys and bondage gear, as well as examples of SEO-optimized porn video descriptions. I will definitely be improving on this dataset continuously; it should be considered a work in progress. My goal is to create a model (or set of models) which can completely replace OpenAI models for erotic roleplay and adult industry use.
Please consider supporting me on Patreon, I'm only asking for about tree fiddy.
https://www.patreon.com/openerotica | leaudhiver/frdrpko | [
"license:apache-2.0",
"region:us"
]
| 2023-11-20T02:19:49+00:00 | {"license": "apache-2.0"} | 2023-11-20T02:34:53+00:00 | []
| []
| TAGS
#license-apache-2.0 #region-us
| 11-20 11:25
DeepL translation completed up to ID 1300. This is an unrefined dataset with no post-processing applied. Translation is still being added; once it is complete, the data will be cleaned and the dataset name will be changed.
original:
This dataset is the result of combing through several reverse proxy log sets and cleaning them of refusals, duplicate, incomplete, and poor quality responses. Lots of manual quality checks. There are also things like ecommerce descriptions for sex toys and bondage gear, as well as examples of SEO-optimized porn video descriptions. I will definitely be improving on this dataset continuously; it should be considered a work in progress. My goal is to create a model (or set of models) which can completely replace OpenAI models for erotic roleplay and adult industry use.
Please consider supporting me on Patreon, I'm only asking for about tree fiddy.
URL | []
| [
"TAGS\n#license-apache-2.0 #region-us \n"
]
| [
14
]
| [
"passage: TAGS\n#license-apache-2.0 #region-us \n"
]
|
9938681640eaa9e15226cef434b68ed133b2dca6 | # Dataset Card for "wiki_find_passage_train200_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/wiki_find_passage_train200_eval40_title | [
"region:us"
]
| 2023-11-20T03:05:57+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 323205, "num_examples": 440}, {"name": "validation", "num_bytes": 33941, "num_examples": 40}], "download_size": 0, "dataset_size": 357146}} | 2023-11-20T03:16:56+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "wiki_find_passage_train200_eval40_title"
More Information needed | [
"# Dataset Card for \"wiki_find_passage_train200_eval40_title\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"wiki_find_passage_train200_eval40_title\"\n\nMore Information needed"
]
| [
6,
26
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"wiki_find_passage_train200_eval40_title\"\n\nMore Information needed"
]
|
0d4bb2bdf73e7f4bbc68b85e6a049f513bc84f82 | # Dataset Card for "wiki_find_passage_train200_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/wiki_find_passage_train200_eval40_rare | [
"region:us"
]
| 2023-11-20T03:06:34+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 317804, "num_examples": 440}, {"name": "validation", "num_bytes": 33460, "num_examples": 40}], "download_size": 146364, "dataset_size": 351264}} | 2023-11-20T03:17:46+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "wiki_find_passage_train200_eval40_rare"
More Information needed | [
"# Dataset Card for \"wiki_find_passage_train200_eval40_rare\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"wiki_find_passage_train200_eval40_rare\"\n\nMore Information needed"
]
| [
6,
26
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"wiki_find_passage_train200_eval40_rare\"\n\nMore Information needed"
]
|
11033a9327276f852386109fdf4f44e5513b56d4 | # Dataset Card for "wiki_find_passage_train200_eval40_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/wiki_find_passage_train200_eval40_num | [
"region:us"
]
| 2023-11-20T03:07:09+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 316002, "num_examples": 440}, {"name": "validation", "num_bytes": 33332, "num_examples": 40}], "download_size": 0, "dataset_size": 349334}} | 2023-11-20T03:18:27+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "wiki_find_passage_train200_eval40_num"
More Information needed | [
"# Dataset Card for \"wiki_find_passage_train200_eval40_num\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"wiki_find_passage_train200_eval40_num\"\n\nMore Information needed"
]
| [
6,
26
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"wiki_find_passage_train200_eval40_num\"\n\nMore Information needed"
]
|
fd5624d6963cf9192f0367d0ddff7603cc20ffb2 | # Dataset Card for "wiki_find_passage_train400_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/wiki_find_passage_train400_eval40_title | [
"region:us"
]
| 2023-11-20T03:07:46+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 637437, "num_examples": 840}, {"name": "validation", "num_bytes": 33941, "num_examples": 40}], "download_size": 0, "dataset_size": 671378}} | 2023-11-20T03:19:10+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "wiki_find_passage_train400_eval40_title"
More Information needed | [
"# Dataset Card for \"wiki_find_passage_train400_eval40_title\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"wiki_find_passage_train400_eval40_title\"\n\nMore Information needed"
]
| [
6,
26
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"wiki_find_passage_train400_eval40_title\"\n\nMore Information needed"
]
|
7963b0ef8d87b58dc2d569bd1d4b2f4d5856f3fb | # Dataset Card for "wiki_find_passage_train400_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/wiki_find_passage_train400_eval40_rare | [
"region:us"
]
| 2023-11-20T03:08:21+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 627940, "num_examples": 840}, {"name": "validation", "num_bytes": 33454, "num_examples": 40}], "download_size": 243645, "dataset_size": 661394}} | 2023-11-20T03:20:00+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "wiki_find_passage_train400_eval40_rare"
More Information needed | [
"# Dataset Card for \"wiki_find_passage_train400_eval40_rare\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"wiki_find_passage_train400_eval40_rare\"\n\nMore Information needed"
]
| [
6,
26
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"wiki_find_passage_train400_eval40_rare\"\n\nMore Information needed"
]
|
4beb00b7d5071f219f6f22c54f4f539b2f40bacc | # Dataset Card for "wiki_find_passage_train400_eval40_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/wiki_find_passage_train400_eval40_num | [
"region:us"
]
| 2023-11-20T03:08:58+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 624884, "num_examples": 840}, {"name": "validation", "num_bytes": 33332, "num_examples": 40}], "download_size": 0, "dataset_size": 658216}} | 2023-11-20T03:20:42+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "wiki_find_passage_train400_eval40_num"
More Information needed | [
"# Dataset Card for \"wiki_find_passage_train400_eval40_num\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"wiki_find_passage_train400_eval40_num\"\n\nMore Information needed"
]
| [
6,
26
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"wiki_find_passage_train400_eval40_num\"\n\nMore Information needed"
]
|
c4ae6558a2ea516c71e2f5caa708d6379cd1e1e8 | # Dataset Card for "random_letter_same_length_find_passage_train200_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random_letter_same_length_find_passage_train200_eval40_title | [
"region:us"
]
| 2023-11-20T03:17:11+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 151441, "num_examples": 440}, {"name": "validation", "num_bytes": 16031, "num_examples": 40}], "download_size": 81084, "dataset_size": 167472}} | 2023-11-20T03:17:19+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "random_letter_same_length_find_passage_train200_eval40_title"
More Information needed | [
"# Dataset Card for \"random_letter_same_length_find_passage_train200_eval40_title\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"random_letter_same_length_find_passage_train200_eval40_title\"\n\nMore Information needed"
]
| [
6,
34
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"random_letter_same_length_find_passage_train200_eval40_title\"\n\nMore Information needed"
]
|
3b26fdc847dfea574ab9cb677858c4f1db9dc144 | # Dataset Card for "random_letter_same_length_find_passage_train200_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random_letter_same_length_find_passage_train200_eval40_rare | [
"region:us"
]
| 2023-11-20T03:18:02+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 146054, "num_examples": 440}, {"name": "validation", "num_bytes": 15546, "num_examples": 40}], "download_size": 79395, "dataset_size": 161600}} | 2023-11-20T03:18:09+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "random_letter_same_length_find_passage_train200_eval40_rare"
More Information needed | [
"# Dataset Card for \"random_letter_same_length_find_passage_train200_eval40_rare\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"random_letter_same_length_find_passage_train200_eval40_rare\"\n\nMore Information needed"
]
| [
6,
34
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"random_letter_same_length_find_passage_train200_eval40_rare\"\n\nMore Information needed"
]
|
57ec10c5b31c6097b1de67987d55ac81776cbf58 | # Dataset Card for "random_letter_same_length_find_passage_train200_eval40_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random_letter_same_length_find_passage_train200_eval40_num | [
"region:us"
]
| 2023-11-20T03:18:43+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 144238, "num_examples": 440}, {"name": "validation", "num_bytes": 15422, "num_examples": 40}], "download_size": 76975, "dataset_size": 159660}} | 2023-11-20T03:18:51+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "random_letter_same_length_find_passage_train200_eval40_num"
More Information needed | [
"# Dataset Card for \"random_letter_same_length_find_passage_train200_eval40_num\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"random_letter_same_length_find_passage_train200_eval40_num\"\n\nMore Information needed"
]
| [
6,
34
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"random_letter_same_length_find_passage_train200_eval40_num\"\n\nMore Information needed"
]
|
e274374524caca5f3ac03e2d2fafa63553a8eac9 | # Dataset Card for "random_letter_same_length_find_passage_train400_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random_letter_same_length_find_passage_train400_eval40_title | [
"region:us"
]
| 2023-11-20T03:19:25+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 299411, "num_examples": 840}, {"name": "validation", "num_bytes": 16031, "num_examples": 40}], "download_size": 135104, "dataset_size": 315442}} | 2023-11-20T03:19:32+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "random_letter_same_length_find_passage_train400_eval40_title"
More Information needed | [
"# Dataset Card for \"random_letter_same_length_find_passage_train400_eval40_title\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"random_letter_same_length_find_passage_train400_eval40_title\"\n\nMore Information needed"
]
| [
6,
34
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"random_letter_same_length_find_passage_train400_eval40_title\"\n\nMore Information needed"
]
|
e4404ae54c02c1a53b9b3a5822b1fc2860a05fb1 |
This dataset was collected for academic purposes and is suitable for NLP tasks such as sentiment analysis. | beltrewilton/punta-cana-spanish-reviews | [
"task_categories:text-classification",
"language:es",
"license:mit",
"region:us"
]
| 2023-11-20T03:19:51+00:00 | {"language": ["es"], "license": "mit", "task_categories": ["text-classification"]} | 2023-11-20T03:25:58+00:00 | []
| [
"es"
]
| TAGS
#task_categories-text-classification #language-Spanish #license-mit #region-us
|
This dataset was collected for academic purposes and is suitable for NLP tasks such as sentiment analysis. | []
| [
"TAGS\n#task_categories-text-classification #language-Spanish #license-mit #region-us \n"
]
| [
27
]
| [
"passage: TAGS\n#task_categories-text-classification #language-Spanish #license-mit #region-us \n"
]
|
b5b161261025b2d7213d609003e48638b8d4dbd7 | Retrieval Augmented Generation for Supervised FineTuning
| coccoc-search/sft_rag | [
"region:us"
]
| 2023-11-20T03:19:56+00:00 | {} | 2023-11-20T04:42:59+00:00 | []
| []
| TAGS
#region-us
| Retrieval Augmented Generation for Supervised FineTuning
| []
| [
"TAGS\n#region-us \n"
]
| [
6
]
| [
"passage: TAGS\n#region-us \n"
]
|
b06f79ae35d370fabc9c5e25f6ff5f2b5e2ce125 | # Dataset Card for "random_letter_same_length_find_passage_train400_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random_letter_same_length_find_passage_train400_eval40_rare | [
"region:us"
]
| 2023-11-20T03:20:16+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 289948, "num_examples": 840}, {"name": "validation", "num_bytes": 15536, "num_examples": 40}], "download_size": 132781, "dataset_size": 305484}} | 2023-11-20T03:20:23+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "random_letter_same_length_find_passage_train400_eval40_rare"
More Information needed | [
"# Dataset Card for \"random_letter_same_length_find_passage_train400_eval40_rare\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"random_letter_same_length_find_passage_train400_eval40_rare\"\n\nMore Information needed"
]
| [
6,
34
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"random_letter_same_length_find_passage_train400_eval40_rare\"\n\nMore Information needed"
]
|
f20007a79dbc01c54c4d506ff52307f63e2b0fe6 | # Dataset Card for "random_letter_same_length_find_passage_train400_eval40_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random_letter_same_length_find_passage_train400_eval40_num | [
"region:us"
]
| 2023-11-20T03:20:58+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 286858, "num_examples": 840}, {"name": "validation", "num_bytes": 15422, "num_examples": 40}], "download_size": 128731, "dataset_size": 302280}} | 2023-11-20T03:21:05+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "random_letter_same_length_find_passage_train400_eval40_num"
More Information needed | [
"# Dataset Card for \"random_letter_same_length_find_passage_train400_eval40_num\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"random_letter_same_length_find_passage_train400_eval40_num\"\n\nMore Information needed"
]
| [
6,
34
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"random_letter_same_length_find_passage_train400_eval40_num\"\n\nMore Information needed"
]
|
d8e3d4e127dec61542200e4ed9c33407e8dbbfd0 | This is a preprocessed version of the realnewslike subdirectory of C4
C4 from: https://huggingface.co/datasets/allenai/c4
Files were generated using Megatron-LM https://github.com/NVIDIA/Megatron-LM/
```
python tools/preprocess_data.py \
--input 'c4/realnewslike/c4-train.0000[0-9]-of-00512.json' \
--partitions 8 \
--output-prefix preprocessed/c4 \
--tokenizer-type GPTSentencePieceTokenizer \
--tokenizer-model tokenizers/tokenizer.model \
--workers 8
```
---
license: odc-by
---
| ufotalent/zero_bubble_sample_dataset | [
"region:us"
]
| 2023-11-20T03:24:52+00:00 | {} | 2023-11-20T03:29:59+00:00 | []
| []
| TAGS
#region-us
| This is a preprocessed version of the realnewslike subdirectory of C4
C4 from: URL
Files were generated using Megatron-LM URL
---
license: odc-by
---
| []
| [
"TAGS\n#region-us \n"
]
| [
6
]
| [
"passage: TAGS\n#region-us \n"
]
|
d9a5e83ccd86215b067f718447454addf52cced8 | # Dataset Card for "imdb-card-pred-binary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | CardinalityLM/imdb-card-pred-binary | [
"region:us"
]
| 2023-11-20T03:52:32+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "true_cardinality", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 40068212.8, "num_examples": 80000}, {"name": "test", "num_bytes": 10017053.2, "num_examples": 20000}], "download_size": 8598252, "dataset_size": 50085266.0}} | 2023-11-20T03:52:38+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "imdb-card-pred-binary"
More Information needed | [
"# Dataset Card for \"imdb-card-pred-binary\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"imdb-card-pred-binary\"\n\nMore Information needed"
]
| [
6,
19
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"imdb-card-pred-binary\"\n\nMore Information needed"
]
|
336da87884bd7f2b17bb1e42c7bc03d01484a989 | # Dataset Card for "imdb-card-pred-decimal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | CardinalityLM/imdb-card-pred-decimal | [
"region:us"
]
| 2023-11-20T03:52:39+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "true_cardinality", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 39101954.4, "num_examples": 80000}, {"name": "test", "num_bytes": 9775488.6, "num_examples": 20000}], "download_size": 8380198, "dataset_size": 48877443.0}} | 2023-11-20T03:52:43+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "imdb-card-pred-decimal"
More Information needed | [
"# Dataset Card for \"imdb-card-pred-decimal\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"imdb-card-pred-decimal\"\n\nMore Information needed"
]
| [
6,
19
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"imdb-card-pred-decimal\"\n\nMore Information needed"
]
|
2932c17b3fb82f79de2444038474e306f419f6d8 | # Dataset Card for "imdb-card-pred-science"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | CardinalityLM/imdb-card-pred-science | [
"region:us"
]
| 2023-11-20T03:52:44+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "true_cardinality", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 39344995.2, "num_examples": 80000}, {"name": "test", "num_bytes": 9836248.8, "num_examples": 20000}], "download_size": 8632989, "dataset_size": 49181244.0}} | 2023-11-20T03:52:49+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "imdb-card-pred-science"
More Information needed | [
"# Dataset Card for \"imdb-card-pred-science\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"imdb-card-pred-science\"\n\nMore Information needed"
]
| [
6,
18
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"imdb-card-pred-science\"\n\nMore Information needed"
]
|
165f683c00b3aae4d99c0647cfb0115655614fe3 |
## Data Format
The dataset is organized in the following structure:
```yaml
dataset/
├── video_id_1/
│ ├── audio_language_1.wav
│ ├── audio_language_2.wav
│ ├── subtitle_language_1.vtt
│ ├── subtitle_language_2.vtt
│ └── unmatched/
│ └── ...
├── video_id_2/
│ ├── ...
└── ...
```
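A minimal sketch of walking this layout to pair audio with same-language subtitles, assuming the language tag is everything after the first underscore in each file stem (an assumption based on the placeholder names above, not something the structure guarantees):

```python
from pathlib import Path

def collect_pairs(root="dataset"):
    """Pair each .wav audio file with a same-language .vtt subtitle per video folder."""
    pairs = []
    for video_dir in Path(root).iterdir():
        if not video_dir.is_dir():
            continue
        # "subtitle_language_1.vtt" -> key "language_1"; same for "audio_language_1.wav".
        subtitles = {p.stem.split("_", 1)[-1]: p for p in video_dir.glob("*.vtt")}
        for audio in video_dir.glob("*.wav"):
            lang = audio.stem.split("_", 1)[-1]
            if lang in subtitles:
                pairs.append((video_dir.name, lang, audio, subtitles[lang]))
    return pairs
```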
The original version with the channel (MrBeast) will contain 487 hours, 27 minutes, and 59 seconds of audio files.
## Limitations
- **Copyright**: Please be aware of copyright restrictions when using this dataset. Ensure that you have the necessary permissions to use the audio and subtitle data for your intended purposes.
- **Inaccuracies**: While efforts have been made to align audio and subtitles accurately, there may be occasional mismatches or inaccuracies in the dataset. We recommend verifying the quality and alignment of the data for your specific use case.
## Generating Dataset
For generating the dataset launch:
1. `generate_urls.py` - to generate video URLs based on `channel_urls.txt`
2. `generate_dataset.py` - for generating dataset (can take **a lot** of time...)
3. `polish_dataset.py` - for cleaning the folders without any useful data
## Usage
The SoundHarvest dataset can be utilized for a variety of applications, including:
### 1. Automatic Speech Recognition (ASR)
Train ASR models to convert spoken language into text. SoundHarvest provides diverse language samples, making it suitable for multilingual ASR tasks.
### 2. Multilingual Natural Language Processing (NLP)
Leverage the dataset for multilingual NLP tasks, such as:
- Speech sentiment analysis.
- Language identification.
### 3. Linguistic Research and Analysis
Conduct linguistic research and analysis to explore various aspects of languages, including phonetics, dialects, and language evolution.
### 4. Speech-to-Speech Translation
Use the dataset to develop and evaluate speech-to-speech translation models. Translate spoken content from one language to another, expanding the dataset's applications to cross-lingual communication.
## Acknowledgments
We would like to express our gratitude to the YouTube content creators for providing valuable multilingual audio content that makes this dataset possible. | ReDUB/SoundHarvest | [
"task_categories:translation",
"task_categories:audio-to-audio",
"size_categories:1K<n<10K",
"language:ar",
"language:es",
"language:fr",
"language:hi",
"language:id",
"language:ja",
"language:ko",
"language:pt",
"language:ru",
"language:th",
"language:tr",
"language:vi",
"language:en",
"license:other",
"speech2speech",
"region:us"
]
| 2023-11-20T04:21:11+00:00 | {"language": ["ar", "es", "fr", "hi", "id", "ja", "ko", "pt", "ru", "th", "tr", "vi", "en"], "license": "other", "size_categories": ["1K<n<10K"], "task_categories": ["translation", "audio-to-audio"], "pretty_name": "SoundHarvest", "tags": ["speech2speech"]} | 2023-12-14T22:51:51+00:00 | []
| [
"ar",
"es",
"fr",
"hi",
"id",
"ja",
"ko",
"pt",
"ru",
"th",
"tr",
"vi",
"en"
]
| TAGS
#task_categories-translation #task_categories-audio-to-audio #size_categories-1K<n<10K #language-Arabic #language-Spanish #language-French #language-Hindi #language-Indonesian #language-Japanese #language-Korean #language-Portuguese #language-Russian #language-Thai #language-Turkish #language-Vietnamese #language-English #license-other #speech2speech #region-us
|
## Data Format
The dataset is organized in the following structure:
The original version with the channel (MrBeast) will contain 487 hours, 27 minutes, and 59 seconds of audio files.
## Limitations
- Copyright: Please be aware of copyright restrictions when using this dataset. Ensure that you have the necessary permissions to use the audio and subtitle data for your intended purposes.
- Inaccuracies: While efforts have been made to align audio and subtitles accurately, there may be occasional mismatches or inaccuracies in the dataset. We recommend verifying the quality and alignment of the data for your specific use case.
## Generating Dataset
For generating the dataset launch:
1. 'generate_urls.py' - to generate video URLs based on 'channel_urls.txt'
2. 'generate_dataset.py' - for generating dataset (can take a lot of time...)
3. 'polish_dataset.py' - for cleaning the folders without any useful data
## Usage
The SoundHarvest dataset can be utilized for a variety of applications, including:
### 1. Automatic Speech Recognition (ASR)
Train ASR models to convert spoken language into text. SoundHarvest provides diverse language samples, making it suitable for multilingual ASR tasks.
### 2. Multilingual Natural Language Processing (NLP)
Leverage the dataset for multilingual NLP tasks, such as:
- Speech sentiment analysis.
- Language identification.
### 3. Linguistic Research and Analysis
Conduct linguistic research and analysis to explore various aspects of languages, including phonetics, dialects, and language evolution.
### 4. Speech-to-Speech Translation
Use the dataset to develop and evaluate speech-to-speech translation models. Translate spoken content from one language to another, expanding the dataset's applications to cross-lingual communication.
## Acknowledgments
We would like to express our gratitude to the YouTube content creators for providing valuable multilingual audio content that makes this dataset possible. | [
"## Data Format\n\nThe dataset is organized in the following structure:\n\n\n\nOriginal version with the channel (MrBeast) will contain 487 hours 27 minutes 59 seconds of audio files.",
"## Limitations\n\n- Copyright: Please be aware of copyright restrictions when using this dataset. Ensure that you have the necessary permissions to use the audio and subtitle data for your intended purposes.\n\n- Inaccuracies: While efforts have been made to align audio and subtitles accurately, there may be occasional mismatches or inaccuracies in the dataset. We recommend verifying the quality and alignment of the data for your specific use case.",
"## Generating Dataset\n\nFor generating the dataset launch:\n1. 'generate_urls.py' - to generate video URLs based on 'channel_urls.txt'\n2. 'generate_dataset.py' - for generating dataset (can take a lot of time...)\n3. 'polish_dataset.py' - for cleaning the folders without any useful data",
"## Usage\n\nThe SoundHarvest dataset can be utilized for a variety of applications, including:",
"### 1. Automatic Speech Recognition (ASR)\n\nTrain ASR models to convert spoken language into text. SoundHarvest provides diverse language samples, making it suitable for multilingual ASR tasks.",
"### 2. Multilingual Natural Language Processing (NLP)\n\nLeverage the dataset for multilingual NLP tasks, such as:\n\n- Speech sentiment analysis.\n- Language identification.",
"### 3. Linguistic Research and Analysis\n\nConduct linguistic research and analysis to explore various aspects of languages, including phonetics, dialects, and language evolution.",
"### 4. Speech-to-Speech Translation\n\nUse the dataset to develop and evaluate speech-to-speech translation models. Translate spoken content from one language to another, expanding the dataset's applications to cross-lingual communication.",
"## Acknowledgments\n\nWe would like to express our gratitude to the YouTube content creators for providing valuable multilingual audio content that makes this dataset possible."
]
| [
"TAGS\n#task_categories-translation #task_categories-audio-to-audio #size_categories-1K<n<10K #language-Arabic #language-Spanish #language-French #language-Hindi #language-Indonesian #language-Japanese #language-Korean #language-Portuguese #language-Russian #language-Thai #language-Turkish #language-Vietnamese #language-English #license-other #speech2speech #region-us \n",
"## Data Format\n\nThe dataset is organized in the following structure:\n\n\n\nOriginal version with the channel (MrBeast) will contain 487 hours 27 minutes 59 seconds of audio files.",
"## Limitations\n\n- Copyright: Please be aware of copyright restrictions when using this dataset. Ensure that you have the necessary permissions to use the audio and subtitle data for your intended purposes.\n\n- Inaccuracies: While efforts have been made to align audio and subtitles accurately, there may be occasional mismatches or inaccuracies in the dataset. We recommend verifying the quality and alignment of the data for your specific use case.",
"## Generating Dataset\n\nFor generating the dataset launch:\n1. 'generate_urls.py' - to generate video URLs based on 'channel_urls.txt'\n2. 'generate_dataset.py' - for generating dataset (can take a lot of time...)\n3. 'polish_dataset.py' - for cleaning the folders without any useful data",
"## Usage\n\nThe SoundHarvest dataset can be utilized for a variety of applications, including:",
"### 1. Automatic Speech Recognition (ASR)\n\nTrain ASR models to convert spoken language into text. SoundHarvest provides diverse language samples, making it suitable for multilingual ASR tasks.",
"### 2. Multilingual Natural Language Processing (NLP)\n\nLeverage the dataset for multilingual NLP tasks, such as:\n\n- Speech sentiment analysis.\n- Language identification.",
"### 3. Linguistic Research and Analysis\n\nConduct linguistic research and analysis to explore various aspects of languages, including phonetics, dialects, and language evolution.",
"### 4. Speech-to-Speech Translation\n\nUse the dataset to develop and evaluate speech-to-speech translation models. Translate spoken content from one language to another, expanding the dataset's applications to cross-lingual communication.",
"## Acknowledgments\n\nWe would like to express our gratitude to the YouTube content creators for providing valuable multilingual audio content that makes this dataset possible."
]
| [
121,
38,
101,
84,
21,
46,
41,
37,
55,
35
]
| [
"passage: TAGS\n#task_categories-translation #task_categories-audio-to-audio #size_categories-1K<n<10K #language-Arabic #language-Spanish #language-French #language-Hindi #language-Indonesian #language-Japanese #language-Korean #language-Portuguese #language-Russian #language-Thai #language-Turkish #language-Vietnamese #language-English #license-other #speech2speech #region-us \n## Data Format\n\nThe dataset is organized in the following structure:\n\n\n\nOriginal version with the channel (MrBeast) will contain 487 hours 27 minutes 59 seconds of audio files.## Limitations\n\n- Copyright: Please be aware of copyright restrictions when using this dataset. Ensure that you have the necessary permissions to use the audio and subtitle data for your intended purposes.\n\n- Inaccuracies: While efforts have been made to align audio and subtitles accurately, there may be occasional mismatches or inaccuracies in the dataset. We recommend verifying the quality and alignment of the data for your specific use case.## Generating Dataset\n\nFor generating the dataset launch:\n1. 'generate_urls.py' - to generate video URLs based on 'channel_urls.txt'\n2. 'generate_dataset.py' - for generating dataset (can take a lot of time...)\n3. 'polish_dataset.py' - for cleaning the folders without any useful data## Usage\n\nThe SoundHarvest dataset can be utilized for a variety of applications, including:### 1. Automatic Speech Recognition (ASR)\n\nTrain ASR models to convert spoken language into text. SoundHarvest provides diverse language samples, making it suitable for multilingual ASR tasks.### 2. Multilingual Natural Language Processing (NLP)\n\nLeverage the dataset for multilingual NLP tasks, such as:\n\n- Speech sentiment analysis.\n- Language identification.### 3. Linguistic Research and Analysis\n\nConduct linguistic research and analysis to explore various aspects of languages, including phonetics, dialects, and language evolution."
]
|
6847b2e03a49e3a5780da956c18b56e3d815a308 | # Dataset Card for "idea-200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | facat/idea-200 | [
"region:us"
]
| 2023-11-20T04:47:24+00:00 | {"dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "class", "dtype": "string"}, {"name": "response_gpt35", "dtype": "string"}, {"name": "response_gpt4", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 371240, "num_examples": 200}], "download_size": 239151, "dataset_size": 371240}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-11-20T12:25:16+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "idea-200"
More Information needed | [
"# Dataset Card for \"idea-200\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"idea-200\"\n\nMore Information needed"
]
| [
6,
12
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"idea-200\"\n\nMore Information needed"
]
|
efc071cd9dbc328447522a224322b2cd75eacae9 | All generated through Tiefighter 13B AWQ on vLLM. Generated in about 5 hours on an A4000. | Superintendent/world-building | [
"region:us"
]
| 2023-11-20T04:55:14+00:00 | {} | 2023-11-21T20:35:29+00:00 | []
| []
| TAGS
#region-us
 | All generated through Tiefighter 13B AWQ on vLLM. Generated in about 5 hours on an A4000. | []
| [
"TAGS\n#region-us \n"
]
| [
6
]
| [
"passage: TAGS\n#region-us \n"
]
|
1c305107fd8ed71b35e646aed92093d74a4d1489 | configs:
- config_name: 1k
data_files:
- split: train
path: "data/train_1k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 5k
data_files:
- split: train
path: "data/train_5k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 10k
data_files:
- split: train
path: "data/train_10k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 15k
data_files:
- split: train
path: "data/train_15k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 20k
data_files:
- split: train
path: "data/train_20k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 30k
data_files:
- split: train
path: "data/train_30k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 50k
data_files:
- split: train
path: "data/train_50k.parquet"
- split: test
path: "data/test.parquet"
# vicuna 실험용 데이터셋
다음 데이터셋으로부터 변환됨:
https://huggingface.co/datasets/junelee/sharegpt_deepl_ko
## 파일구조
- converted.parquet : 원본 데이터셋의 ko_alpaca_style_dataset.json을 트레이닝에 맞도록 형식 변환
## 라이센스
원본 데이터가 OPENAI 이기 때문에 해당 [약관](https://openai.com/policies/terms-of-use)에 따릅니다.
그 이외의 부분은 다음 라이센스를 따릅니다: 저작자표시 2.0 대한민국 (CC BY 2.0 KR)
| EP45/test | [
"region:us"
]
| 2023-11-20T05:28:11+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train_15k.parquet"}, {"split": "test", "path": "data/test.parquet"}]}]} | 2023-12-02T09:21:32+00:00 | []
| []
| TAGS
#region-us
| configs:
- config_name: 1k
data_files:
- split: train
path: "data/train_1k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 5k
data_files:
- split: train
path: "data/train_5k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 10k
data_files:
- split: train
path: "data/train_10k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 15k
data_files:
- split: train
path: "data/train_15k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 20k
data_files:
- split: train
path: "data/train_20k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 30k
data_files:
- split: train
path: "data/train_30k.parquet"
- split: test
path: "data/test.parquet"
- config_name: 50k
data_files:
- split: train
path: "data/train_50k.parquet"
- split: test
path: "data/test.parquet"
# vicuna 실험용 데이터셋
다음 데이터셋으로부터 변환됨:
URL
## 파일구조
- converted.parquet : 원본 데이터셋의 ko_alpaca_style_dataset.json을 트레이닝에 맞도록 형식 변환
## 라이센스
원본 데이터가 OPENAI 이기 때문에 해당 약관에 따릅니다.
그 이외의 부분은 다음 라이센스를 따릅니다: 저작자표시 2.0 대한민국 (CC BY 2.0 KR)
| [
"# vicuna 실험용 데이터셋\n\n다음 데이터셋으로부터 변환됨:\nURL",
"## 파일구조\n- converted.parquet : 원본 데이터셋의 ko_alpaca_style_dataset.json을 트레이닝에 맞도록 형식 변환",
"## 라이센스\n\n원본 데이터가 OPENAI 이기 때문에 해당 약관에 따릅니다.\n그 이외의 부분은 다음 라이센스를 따릅니다: 저작자표시 2.0 대한민국 (CC BY 2.0 KR)"
]
| [
"TAGS\n#region-us \n",
"# vicuna 실험용 데이터셋\n\n다음 데이터셋으로부터 변환됨:\nURL",
"## 파일구조\n- converted.parquet : 원본 데이터셋의 ko_alpaca_style_dataset.json을 트레이닝에 맞도록 형식 변환",
"## 라이센스\n\n원본 데이터가 OPENAI 이기 때문에 해당 약관에 따릅니다.\n그 이외의 부분은 다음 라이센스를 따릅니다: 저작자표시 2.0 대한민국 (CC BY 2.0 KR)"
]
| [
6,
17,
37,
47
]
| [
"passage: TAGS\n#region-us \n# vicuna 실험용 데이터셋\n\n다음 데이터셋으로부터 변환됨:\nURL## 파일구조\n- converted.parquet : 원본 데이터셋의 ko_alpaca_style_dataset.json을 트레이닝에 맞도록 형식 변환## 라이센스\n\n원본 데이터가 OPENAI 이기 때문에 해당 약관에 따릅니다.\n그 이외의 부분은 다음 라이센스를 따릅니다: 저작자표시 2.0 대한민국 (CC BY 2.0 KR)"
]
|
96ed11d13915874f109bedc3cc1f2eabb736a76c | # Dataset Card for "find_sent_before_sent_train_100_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_sent_before_sent_train_100_eval_40 | [
"region:us"
]
| 2023-11-20T05:36:35+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 777978, "num_examples": 644}, {"name": "validation", "num_bytes": 223538, "num_examples": 202}], "download_size": 273207, "dataset_size": 1001516}} | 2023-11-20T06:20:42+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_sent_before_sent_train_100_eval_40"
More Information needed | [
"# Dataset Card for \"find_sent_before_sent_train_100_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_sent_before_sent_train_100_eval_40\"\n\nMore Information needed"
]
| [
6,
28
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_before_sent_train_100_eval_40\"\n\nMore Information needed"
]
|
6540ed3ccbf88e3e80a8a73397db789c48b25beb | # Dataset Card for "find_sent_after_sent_train_100_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_sent_after_sent_train_100_eval_40 | [
"region:us"
]
| 2023-11-20T05:36:43+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 776548, "num_examples": 644}, {"name": "validation", "num_bytes": 223190, "num_examples": 202}], "download_size": 275129, "dataset_size": 999738}} | 2023-11-20T06:20:52+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_sent_after_sent_train_100_eval_40"
More Information needed | [
"# Dataset Card for \"find_sent_after_sent_train_100_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_sent_after_sent_train_100_eval_40\"\n\nMore Information needed"
]
| [
6,
28
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_after_sent_train_100_eval_40\"\n\nMore Information needed"
]
|
aa3420d93f696f7a810829cee33d6ff11818fa47 | # Dataset Card for "find_first_sent_train_200_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_first_sent_train_200_eval_40 | [
"region:us"
]
| 2023-11-20T05:37:06+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 570793, "num_examples": 440}, {"name": "validation", "num_bytes": 40604, "num_examples": 40}], "download_size": 0, "dataset_size": 611397}} | 2023-11-20T06:21:13+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_first_sent_train_200_eval_40"
More Information needed | [
"# Dataset Card for \"find_first_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_first_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
6,
26
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_first_sent_train_200_eval_40\"\n\nMore Information needed"
]
|
169075465b3ea218196c2bf18d8ad66b52849eaf | # Dataset Card for "find_second_sent_train_200_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_second_sent_train_200_eval_40 | [
"region:us"
]
| 2023-11-20T05:37:14+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 570351, "num_examples": 440}, {"name": "validation", "num_bytes": 41108, "num_examples": 40}], "download_size": 0, "dataset_size": 611459}} | 2023-11-20T06:21:17+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_second_sent_train_200_eval_40"
More Information needed | [
"# Dataset Card for \"find_second_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_second_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
6,
25
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_second_sent_train_200_eval_40\"\n\nMore Information needed"
]
|
7f7319cabbd45e9d9c16a8edbcbb6b2b05e56250 | # Dataset Card for "find_last_sent_train_200_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_last_sent_train_200_eval_40 | [
"region:us"
]
| 2023-11-20T05:37:20+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 570390, "num_examples": 440}, {"name": "validation", "num_bytes": 39956, "num_examples": 40}], "download_size": 0, "dataset_size": 610346}} | 2023-11-20T06:21:22+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_last_sent_train_200_eval_40"
More Information needed | [
"# Dataset Card for \"find_last_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_last_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
6,
25
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_last_sent_train_200_eval_40\"\n\nMore Information needed"
]
|
60b5687e757eba4147007810deaacde759e0c741 | # Dataset Card for "find_sent_before_sent_train_200_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_sent_before_sent_train_200_eval_40 | [
"region:us"
]
| 2023-11-20T05:37:27+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1527376, "num_examples": 1263}, {"name": "validation", "num_bytes": 236067, "num_examples": 203}], "download_size": 450742, "dataset_size": 1763443}} | 2023-11-20T06:21:34+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_sent_before_sent_train_200_eval_40"
More Information needed | [
"# Dataset Card for \"find_sent_before_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_sent_before_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
6,
28
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_before_sent_train_200_eval_40\"\n\nMore Information needed"
]
|
553bb7402632c796fad4337bd56522ca7a33c795 | # Dataset Card for "find_sent_after_sent_train_200_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_sent_after_sent_train_200_eval_40 | [
"region:us"
]
| 2023-11-20T05:37:34+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1526150, "num_examples": 1263}, {"name": "validation", "num_bytes": 235256, "num_examples": 203}], "download_size": 452785, "dataset_size": 1761406}} | 2023-11-20T06:21:45+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_sent_after_sent_train_200_eval_40"
More Information needed | [
"# Dataset Card for \"find_sent_after_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_sent_after_sent_train_200_eval_40\"\n\nMore Information needed"
]
| [
6,
28
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_after_sent_train_200_eval_40\"\n\nMore Information needed"
]
|
ede148c59733bfaf7c96f522f9baae8bc186bd31 | # Dataset Card for "find_first_sent_train_400_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_first_sent_train_400_eval_40 | [
"region:us"
]
| 2023-11-20T05:37:59+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1072675, "num_examples": 840}, {"name": "validation", "num_bytes": 41217, "num_examples": 40}], "download_size": 0, "dataset_size": 1113892}} | 2023-11-20T06:22:08+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_first_sent_train_400_eval_40"
More Information needed | [
"# Dataset Card for \"find_first_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_first_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
6,
26
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_first_sent_train_400_eval_40\"\n\nMore Information needed"
]
|
20186eb4b38bfcc0e44d8359708bb89c55db290c | # Dataset Card for "find_second_sent_train_400_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_second_sent_train_400_eval_40 | [
"region:us"
]
| 2023-11-20T05:38:06+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1073147, "num_examples": 840}, {"name": "validation", "num_bytes": 40955, "num_examples": 40}], "download_size": 0, "dataset_size": 1114102}} | 2023-11-20T06:22:12+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_second_sent_train_400_eval_40"
More Information needed | [
"# Dataset Card for \"find_second_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_second_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
6,
25
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_second_sent_train_400_eval_40\"\n\nMore Information needed"
]
|
cd14209d7ba4c95f3918dcf65ac47aca61599124 | # Dataset Card for "find_last_sent_train_400_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_last_sent_train_400_eval_40 | [
"region:us"
]
| 2023-11-20T05:38:13+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1071032, "num_examples": 840}, {"name": "validation", "num_bytes": 41250, "num_examples": 40}], "download_size": 0, "dataset_size": 1112282}} | 2023-11-20T06:22:15+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_last_sent_train_400_eval_40"
More Information needed | [
"# Dataset Card for \"find_last_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_last_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
6,
25
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_last_sent_train_400_eval_40\"\n\nMore Information needed"
]
|
a53e4f6252379f4ace0a0c98dea2c8ce9f337ba7 | # Dataset Card for "find_sent_before_sent_train_400_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_sent_before_sent_train_400_eval_40 | [
"region:us"
]
| 2023-11-20T05:38:19+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2938930, "num_examples": 2434}, {"name": "validation", "num_bytes": 232610, "num_examples": 200}], "download_size": 775051, "dataset_size": 3171540}} | 2023-11-20T06:22:27+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_sent_before_sent_train_400_eval_40"
More Information needed | [
"# Dataset Card for \"find_sent_before_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_sent_before_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
6,
28
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_before_sent_train_400_eval_40\"\n\nMore Information needed"
]
|
871732129c4bdf1f6b0e79f4b88a21bba866ee91 | # Dataset Card for "find_sent_after_sent_train_400_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_sent_after_sent_train_400_eval_40 | [
"region:us"
]
| 2023-11-20T05:38:26+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2935693, "num_examples": 2434}, {"name": "validation", "num_bytes": 232483, "num_examples": 200}], "download_size": 778097, "dataset_size": 3168176}} | 2023-11-20T06:22:38+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "find_sent_after_sent_train_400_eval_40"
More Information needed | [
"# Dataset Card for \"find_sent_after_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_sent_after_sent_train_400_eval_40\"\n\nMore Information needed"
]
| [
6,
28
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_after_sent_train_400_eval_40\"\n\nMore Information needed"
]
|
ea50bd049ed645ffe612d5fe00c3bb220e5f437d | # Dataset Card for "squad_title_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/squad_title_v4_train_30_eval_10_recite_ans_sent | [
"region:us"
]
| 2023-11-20T06:52:40+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 590548, "num_examples": 368}, {"name": "validation", "num_bytes": 60559, "num_examples": 50}], "download_size": 118596, "dataset_size": 651107}} | 2023-11-20T06:52:47+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "squad_title_v4_train_30_eval_10_recite_ans_sent"
More Information needed | [
"# Dataset Card for \"squad_title_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_title_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
6,
34
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_title_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
|
565787b9d5da31d3e16b1bcf7893243c73123496 | # Dataset Card for "squad_wrong_title_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/squad_wrong_title_v4_train_30_eval_10_recite_ans_sent | [
"region:us"
]
| 2023-11-20T06:53:02+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 590548, "num_examples": 368}, {"name": "validation", "num_bytes": 60527, "num_examples": 50}], "download_size": 118986, "dataset_size": 651075}} | 2023-11-20T06:53:08+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "squad_wrong_title_v4_train_30_eval_10_recite_ans_sent"
More Information needed | [
"# Dataset Card for \"squad_wrong_title_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_wrong_title_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
6,
37
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_wrong_title_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
|
3a7d46963ca0592f57161aeafbf8cee47dede9aa | # Dataset Card for "squad_no_title_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/squad_no_title_v4_train_30_eval_10_recite_ans_sent | [
"region:us"
]
| 2023-11-20T06:53:22+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 590548, "num_examples": 368}, {"name": "validation", "num_bytes": 48707, "num_examples": 50}], "download_size": 113536, "dataset_size": 639255}} | 2023-11-20T06:53:28+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "squad_no_title_v4_train_30_eval_10_recite_ans_sent"
More Information needed | [
"# Dataset Card for \"squad_no_title_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_no_title_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
6,
36
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_no_title_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
|
7def32e4815841d72fe966c2be99048236b2ff1a | # Dataset Card for "squad_no_title_strict_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/squad_no_title_strict_v4_train_30_eval_10_recite_ans_sent | [
"region:us"
]
| 2023-11-20T06:53:41+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 548439, "num_examples": 368}, {"name": "validation", "num_bytes": 48707, "num_examples": 50}], "download_size": 104798, "dataset_size": 597146}} | 2023-11-20T06:53:48+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "squad_no_title_strict_v4_train_30_eval_10_recite_ans_sent"
More Information needed | [
"# Dataset Card for \"squad_no_title_strict_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_no_title_strict_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
6,
38
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_no_title_strict_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
|
dba901eafa10d234456612492887db66b7ec8649 | # Dataset Card for "squad_baseline_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/squad_baseline_v4_train_30_eval_10_recite_ans_sent | [
"region:us"
]
| 2023-11-20T06:54:01+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 172536, "num_examples": 159}, {"name": "validation", "num_bytes": 47457, "num_examples": 50}], "download_size": 75697, "dataset_size": 219993}} | 2023-11-20T06:54:07+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "squad_baseline_v4_train_30_eval_10_recite_ans_sent"
More Information needed | [
"# Dataset Card for \"squad_baseline_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_baseline_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
6,
35
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_baseline_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
|
ae0e3d71d484d5b9dbcbe0fd7c255b07a43c5f9c | # Dataset Card for "squad_rare_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/squad_rare_v4_train_30_eval_10_recite_ans_sent | [
"region:us"
]
| 2023-11-20T06:54:21+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 581992, "num_examples": 368}, {"name": "validation", "num_bytes": 59435, "num_examples": 50}], "download_size": 117856, "dataset_size": 641427}} | 2023-11-20T06:54:27+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "squad_rare_v4_train_30_eval_10_recite_ans_sent"
More Information needed | [
"# Dataset Card for \"squad_rare_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_rare_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
6,
34
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_rare_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
|
c8dfe8306567366d95bf8025af1190f5c8bf49ab | # Dataset Card for "squad_no_rare_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/squad_no_rare_v4_train_30_eval_10_recite_ans_sent | [
"region:us"
]
| 2023-11-20T06:54:40+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 581992, "num_examples": 368}, {"name": "validation", "num_bytes": 48145, "num_examples": 50}], "download_size": 112955, "dataset_size": 630137}} | 2023-11-20T06:54:46+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "squad_no_rare_v4_train_30_eval_10_recite_ans_sent"
More Information needed | [
"# Dataset Card for \"squad_no_rare_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_no_rare_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
6,
36
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_no_rare_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
|
459500d772675f03c255719c58e07b7a63894174 | # Dataset Card for "squad_wrong_rare_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/squad_wrong_rare_v4_train_30_eval_10_recite_ans_sent | [
"region:us"
]
| 2023-11-20T06:54:59+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 581992, "num_examples": 368}, {"name": "validation", "num_bytes": 59965, "num_examples": 50}], "download_size": 118405, "dataset_size": 641957}} | 2023-11-20T06:55:07+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "squad_wrong_rare_v4_train_30_eval_10_recite_ans_sent"
More Information needed | [
"# Dataset Card for \"squad_wrong_rare_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_wrong_rare_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
6,
37
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_wrong_rare_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
|
4712a7fc1276184bb9af845a5fa490928e0051fa | # Dataset Card for "squad_no_rare_strict_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/squad_no_rare_strict_v4_train_30_eval_10_recite_ans_sent | [
"region:us"
]
| 2023-11-20T06:55:20+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 541741, "num_examples": 368}, {"name": "validation", "num_bytes": 48145, "num_examples": 50}], "download_size": 104315, "dataset_size": 589886}} | 2023-11-20T06:55:27+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "squad_no_rare_strict_v4_train_30_eval_10_recite_ans_sent"
More Information needed | [
"# Dataset Card for \"squad_no_rare_strict_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_no_rare_strict_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
| [
6,
38
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_no_rare_strict_v4_train_30_eval_10_recite_ans_sent\"\n\nMore Information needed"
]
|
5745799e5451f0fe931537374ee1bfe4174c9e4d | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | rjhuang/my_wb_preferences | [
"task_categories:text-classification",
"language:zh",
"license:apache-2.0",
"social",
"region:us"
]
| 2023-11-20T06:59:34+00:00 | {"language": ["zh"], "license": "apache-2.0", "task_categories": ["text-classification"], "tags": ["social"], "dataset_info": {"features": [{"name": "author", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 156400, "num_examples": 417}], "download_size": 100117, "dataset_size": 156400}} | 2023-11-20T08:27:26+00:00 | []
| [
"zh"
]
| TAGS
#task_categories-text-classification #language-Chinese #license-apache-2.0 #social #region-us
| # Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
| [
"TAGS\n#task_categories-text-classification #language-Chinese #license-apache-2.0 #social #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
| [
32,
34,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
]
| [
"passage: TAGS\n#task_categories-text-classification #language-Chinese #license-apache-2.0 #social #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
]
|
2e2c2c637ae5afec0494967a29d591c161398a9d | # B3 Historical Quotes
<!-- Provide a quick summary of the dataset. -->
This dataset is a collection of historical quotes from the Brazilian stock market (B3).
It contains historical quotes from all stocks in the country from Jan/2015 until Oct/2023.
## Dataset Details
All the data was retrieved as is from [B3 Historical Data](https://www.b3.com.br/en_us/market-data-and-indices/data-services/market-data/historical-data/equities/historical-quotes/)
and parsed to a csv. The columns are the same as the ones from the original content.
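A minimal loading sketch, assuming the CSV files in this repository are auto-detected by the `datasets` library (not confirmed by the card):

```python
from datasets import load_dataset

# Sketch: load the quotes and inspect the available columns.
# Column names follow B3's original historical-quotes layout and are
# documented by B3 rather than listed here.
quotes = load_dataset("jmbrito/b3-historical-quotes", split="train")
print(quotes.column_names)
print(quotes[0])
```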
If you need more information about the columns, it can be found in the [official B3 documentation](https://www.b3.com.br/en_us/market-data-and-indices/data-services/market-data/historical-data/equities/historical-quote-data/). | jmbrito/b3-historical-quotes | [
"size_categories:1M<n<10M",
"license:mit",
"finance",
"b3",
"quotes",
"historical",
"region:us"
]
| 2023-11-20T07:00:53+00:00 | {"license": "mit", "size_categories": ["1M<n<10M"], "pretty_name": "B3 Historical Quotes", "tags": ["finance", "b3", "quotes", "historical"]} | 2024-01-31T15:35:18+00:00 | []
| []
| TAGS
#size_categories-1M<n<10M #license-mit #finance #b3 #quotes #historical #region-us
| # B3 Historical Quotes
This dataset is a collection of historical quotes from the Brazilian stock market (B3).
It contains historical quotes from all stocks in the country from Jan/2015 until Oct/2023.
## Dataset Details
All the data was retrieved as is from B3 Historical Data
and parsed to a csv. The columns are the same as the ones from the original content.
If you need more information about the columns, it can be found in the official B3 documentation. | [
"# B3 Historical Quotes\n\n\n\nThis dataset is a collection of historical quotes from the brazilian stock market(B3). \n\nIt contains historical quotes from all stocks in the country from Jan/2015 until Oct/2023.",
"## Dataset Details\n\nAll the data was retrieved as is from B3 Historical Data\nand parsed to a csv. The columns are the same as the ones from the original content.\n\nIf you need more informations about the columns, it can be found in the official b3 documentation."
]
| [
"TAGS\n#size_categories-1M<n<10M #license-mit #finance #b3 #quotes #historical #region-us \n",
"# B3 Historical Quotes\n\n\n\nThis dataset is a collection of historical quotes from the brazilian stock market(B3). \n\nIt contains historical quotes from all stocks in the country from Jan/2015 until Oct/2023.",
"## Dataset Details\n\nAll the data was retrieved as is from B3 Historical Data\nand parsed to a csv. The columns are the same as the ones from the original content.\n\nIf you need more informations about the columns, it can be found in the official b3 documentation."
]
| [
35,
48,
64
]
| [
"passage: TAGS\n#size_categories-1M<n<10M #license-mit #finance #b3 #quotes #historical #region-us \n# B3 Historical Quotes\n\n\n\nThis dataset is a collection of historical quotes from the brazilian stock market(B3). \n\nIt contains historical quotes from all stocks in the country from Jan/2015 until Oct/2023.## Dataset Details\n\nAll the data was retrieved as is from B3 Historical Data\nand parsed to a csv. The columns are the same as the ones from the original content.\n\nIf you need more informations about the columns, it can be found in the official b3 documentation."
]
|
602f0953ea0b3aa348eecbb94962af12d695c1f9 | # Dataset Card for "NER-VN"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | NguyenVanHieu1605/NER-VN | [
"region:us"
]
| 2023-11-20T07:08:24+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "NE_labels", "sequence": "int64"}, {"name": "nested_NE_labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 7653204, "num_examples": 13486}, {"name": "valid", "num_bytes": 1915087, "num_examples": 3372}, {"name": "test", "num_bytes": 1706240, "num_examples": 2831}], "download_size": 1632188, "dataset_size": 11274531}} | 2023-11-20T07:08:45+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "NER-VN"
More Information needed | [
"# Dataset Card for \"NER-VN\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"NER-VN\"\n\nMore Information needed"
]
| [
6,
13
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"NER-VN\"\n\nMore Information needed"
]
|
7b7a95f9cd823c80428a3761e7c81a604bd5dc6e | # Dataset Card for "captions10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | cherry0324/captions10 | [
"region:us"
]
| 2023-11-20T07:23:08+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 207987814.0, "num_examples": 50000}], "download_size": 221800655, "dataset_size": 207987814.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-11-20T07:25:23+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "captions10"
More Information needed | [
"# Dataset Card for \"captions10\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"captions10\"\n\nMore Information needed"
]
| [
6,
13
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"captions10\"\n\nMore Information needed"
]
|
ea768d2c3bcbee7f89f50e55d6ec4e749948bf1d | Invoice Data | kaniam/invoice | [
"region:us"
]
| 2023-11-20T07:26:10+00:00 | {} | 2023-12-21T08:36:37+00:00 | []
| []
| TAGS
#region-us
| Invoice Data | []
| [
"TAGS\n#region-us \n"
]
| [
6
]
| [
"passage: TAGS\n#region-us \n"
]
|
fad62af8386dd1c31bef0cbace0c2d579eab1f40 |
# Dataset Card for Evaluation run of SciPhi/SciPhi-Self-RAG-Mistral-7B-32k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/SciPhi/SciPhi-Self-RAG-Mistral-7B-32k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [SciPhi/SciPhi-Self-RAG-Mistral-7B-32k](https://huggingface.co/SciPhi/SciPhi-Self-RAG-Mistral-7B-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SciPhi__SciPhi-Self-RAG-Mistral-7B-32k_public",
"harness_winogrande_5",
split="train")
```
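If the split loads as a regular `datasets.Dataset` (the exact schema varies per harness task), a quick way to inspect it is:

```python
# Sketch: list the columns of the loaded details split and look at one record.
print(data.column_names)
print(data[0])
```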
## Latest results
These are the [latest results from run 2023-11-20T07:34:10.299317](https://huggingface.co/datasets/open-llm-leaderboard/details_SciPhi__SciPhi-Self-RAG-Mistral-7B-32k_public/blob/main/results_2023-11-20T07-34-10.299317.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6025602935621713,
"acc_stderr": 0.03298893084060298,
"acc_norm": 0.6108691668755563,
"acc_norm_stderr": 0.033697327638679664,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.45630119540652325,
"mc2_stderr": 0.015576968550952539,
"em": 0.2489513422818792,
"em_stderr": 0.004428237695563259,
"f1": 0.31078754194630925,
"f1_stderr": 0.004386404073454112
},
"harness|arc:challenge|25": {
"acc": 0.5341296928327645,
"acc_stderr": 0.0145773113152311,
"acc_norm": 0.5733788395904437,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.6132244572794264,
"acc_stderr": 0.004860162076330984,
"acc_norm": 0.8044214299940251,
"acc_norm_stderr": 0.003958347934520328
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319878,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319878
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923992,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923992
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.0253781399708852,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.0253781399708852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139746,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871934,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871934
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200144,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200144
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489267,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489267
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594204,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594204
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.016155910721341763,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.016155910721341763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145894,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409818,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409818
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799798,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799798
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.019808281317449848,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.019808281317449848
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.45630119540652325,
"mc2_stderr": 0.015576968550952539
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.01219848910025978
},
"harness|drop|3": {
"em": 0.2489513422818792,
"em_stderr": 0.004428237695563259,
"f1": 0.31078754194630925,
"f1_stderr": 0.004386404073454112
},
"harness|gsm8k|5": {
"acc": 0.19711902956785443,
"acc_stderr": 0.01095802163030063
}
}
```
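
If you prefer the raw JSON linked above, a minimal sketch using `huggingface_hub` (assuming the file name shown in the link; the on-disk file may nest the dictionary printed above under a "results" key) could be:

```python
import json
from huggingface_hub import hf_hub_download

# Download the results JSON referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_SciPhi__SciPhi-Self-RAG-Mistral-7B-32k_public",
    filename="results_2023-11-20T07-34-10.299317.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The file may nest the per-task dict shown above under "results".
scores = data.get("results", data)

# Mean accuracy over the MMLU (hendrycksTest) subtasks, as an illustration.
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")]
print(sum(mmlu) / len(mmlu))
```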
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_SciPhi__SciPhi-Self-RAG-Mistral-7B-32k | [
"region:us"
]
| 2023-11-20T07:37:08+00:00 | {"pretty_name": "Evaluation run of SciPhi/SciPhi-Self-RAG-Mistral-7B-32k", "dataset_summary": "Dataset automatically created during the evaluation run of model [SciPhi/SciPhi-Self-RAG-Mistral-7B-32k](https://huggingface.co/SciPhi/SciPhi-Self-RAG-Mistral-7B-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SciPhi__SciPhi-Self-RAG-Mistral-7B-32k_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-20T07:34:10.299317](https://huggingface.co/datasets/open-llm-leaderboard/details_SciPhi__SciPhi-Self-RAG-Mistral-7B-32k_public/blob/main/results_2023-11-20T07-34-10.299317.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6025602935621713,\n \"acc_stderr\": 0.03298893084060298,\n \"acc_norm\": 0.6108691668755563,\n \"acc_norm_stderr\": 0.033697327638679664,\n \"mc1\": 0.3011015911872705,\n \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.45630119540652325,\n \"mc2_stderr\": 0.015576968550952539,\n \"em\": 0.2489513422818792,\n \"em_stderr\": 0.004428237695563259,\n \"f1\": 0.31078754194630925,\n \"f1_stderr\": 0.004386404073454112\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5341296928327645,\n \"acc_stderr\": 0.0145773113152311,\n \"acc_norm\": 0.5733788395904437,\n \"acc_norm_stderr\": 0.014453185592920293\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6132244572794264,\n \"acc_stderr\": 0.004860162076330984,\n \"acc_norm\": 0.8044214299940251,\n \"acc_norm_stderr\": 0.003958347934520328\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319878,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319878\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n 
\"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923992,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923992\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n \"acc_stderr\": 0.0253781399708852,\n \"acc_norm\": 0.7258064516129032,\n \"acc_norm_stderr\": 0.0253781399708852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139746,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139746\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871934,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871934\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200144,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200144\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.029102254389674082,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.029102254389674082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489267,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489267\n },\n 
\"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n \"acc_stderr\": 0.014317653708594204,\n \"acc_norm\": 0.7994891443167306,\n \"acc_norm_stderr\": 0.014317653708594204\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n \"acc_stderr\": 0.016155910721341763,\n \"acc_norm\": 0.37094972067039106,\n \"acc_norm_stderr\": 0.016155910721341763\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145894,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409818,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409818\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n \"acc_stderr\": 0.012727084826799798,\n \"acc_norm\": 0.4589308996088657,\n \"acc_norm_stderr\": 0.012727084826799798\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.019808281317449848,\n \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.019808281317449848\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.45630119540652325,\n 
\"mc2_stderr\": 0.015576968550952539\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.01219848910025978\n },\n \"harness|drop|3\": {\n \"em\": 0.2489513422818792,\n \"em_stderr\": 0.004428237695563259,\n \"f1\": 0.31078754194630925,\n \"f1_stderr\": 0.004386404073454112\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19711902956785443,\n \"acc_stderr\": 0.01095802163030063\n }\n}\n```", "repo_url": "https://huggingface.co/SciPhi/SciPhi-Self-RAG-Mistral-7B-32k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|arc:challenge|25_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|drop|3_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|gsm8k|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hellaswag|10_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T07-34-10.299317.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T07-34-10.299317.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-20T07-34-10.299317.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["**/details_harness|winogrande|5_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-20T07-34-10.299317.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_20T07_34_10.299317", "path": ["results_2023-11-20T07-34-10.299317.parquet"]}, {"split": "latest", "path": ["results_2023-11-20T07-34-10.299317.parquet"]}]}]} | 2023-11-20T07:37:54+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of SciPhi/SciPhi-Self-RAG-Mistral-7B-32k
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model SciPhi/SciPhi-Self-RAG-Mistral-7B-32k on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
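A minimal sketch, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>_public` naming used for these evaluation runs:

```python
from datasets import load_dataset

# Hypothetical repository id, assuming the standard naming pattern for these runs.
data = load_dataset(
    "open-llm-leaderboard/details_SciPhi__SciPhi-Self-RAG-Mistral-7B-32k_public",
    "harness_winogrande_5",
    split="train",
)
```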
## Latest results
These are the latest results from run 2023-11-20T07:34:10.299317 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of SciPhi/SciPhi-Self-RAG-Mistral-7B-32k",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model SciPhi/SciPhi-Self-RAG-Mistral-7B-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-20T07:34:10.299317(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SciPhi/SciPhi-Self-RAG-Mistral-7B-32k",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model SciPhi/SciPhi-Self-RAG-Mistral-7B-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-20T07:34:10.299317(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
30,
31,
179,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of SciPhi/SciPhi-Self-RAG-Mistral-7B-32k## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model SciPhi/SciPhi-Self-RAG-Mistral-7B-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-20T07:34:10.299317(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
e6c70d87bdc3e736d828f4ec3a4401d525bef6ad |
# Dataset Card for Evaluation run of maywell/Synatra-7B-v0.3-dpo
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/maywell/Synatra-7B-v0.3-dpo
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [maywell/Synatra-7B-v0.3-dpo](https://huggingface.co/maywell/Synatra-7B-v0.3-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-dpo_public",
"harness_winogrande_5",
split="train")
```
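The aggregated scores of the run live in the "results" configuration; as a minimal sketch using the same repository, they can be loaded through the "latest" split:

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent run
# (the "latest" split always points to the newest results file).
results = load_dataset(
    "open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-dpo_public",
    "results",
    split="latest",
)
print(results[0])  # one row containing the aggregated metrics of the run
```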
## Latest results
These are the [latest results from run 2023-11-20T08:03:37.008028](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-dpo_public/blob/main/results_2023-11-20T08-03-37.008028.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.610854861666512,
"acc_stderr": 0.03282789791741049,
"acc_norm": 0.6184353715807913,
"acc_norm_stderr": 0.03351856767139879,
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5646058699056372,
"mc2_stderr": 0.015306312553856578,
"em": 0.006711409395973154,
"em_stderr": 0.0008361500895152437,
"f1": 0.086758598993289,
"f1_stderr": 0.0017937356641132749
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946709,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6278629755028878,
"acc_stderr": 0.004823867761332464,
"acc_norm": 0.8258315076677952,
"acc_norm_stderr": 0.0037847921724660665
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094527,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094527
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.01690927688493607,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.01690927688493607
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545843,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545843
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.036959801280988226,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.036959801280988226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636856,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636856
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.015852002449862106,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.015852002449862106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281413,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115327,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115327
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.012704030518851488,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.012704030518851488
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.019506291693954847,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.019506291693954847
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.0300210562384403,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.0300210562384403
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5646058699056372,
"mc2_stderr": 0.015306312553856578
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803145
},
"harness|drop|3": {
"em": 0.006711409395973154,
"em_stderr": 0.0008361500895152437,
"f1": 0.086758598993289,
"f1_stderr": 0.0017937356641132749
},
"harness|gsm8k|5": {
"acc": 0.23730098559514784,
"acc_stderr": 0.011718409178739446
}
}
```
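For a quick summary of the numbers above, the JSON can also be inspected directly; the sketch below assumes the block has been saved locally as `results.json` and averages the MMLU (hendrycksTest) subtask accuracies:

```python
import json

# Hypothetical helper: summarize the aggregated metrics shown above,
# assuming the JSON block was saved locally as "results.json".
with open("results.json") as f:
    results = json.load(f)

print("overall acc_norm:", results["all"]["acc_norm"])

# Average accuracy over the hendrycksTest (MMLU) subtasks.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```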
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-dpo | [
"region:us"
]
| 2023-11-20T08:06:37+00:00 | {"pretty_name": "Evaluation run of maywell/Synatra-7B-v0.3-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [maywell/Synatra-7B-v0.3-dpo](https://huggingface.co/maywell/Synatra-7B-v0.3-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-dpo_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-20T08:03:37.008028](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-dpo_public/blob/main/results_2023-11-20T08-03-37.008028.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.610854861666512,\n \"acc_stderr\": 0.03282789791741049,\n \"acc_norm\": 0.6184353715807913,\n \"acc_norm_stderr\": 0.03351856767139879,\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5646058699056372,\n \"mc2_stderr\": 0.015306312553856578,\n \"em\": 0.006711409395973154,\n \"em_stderr\": 0.0008361500895152437,\n \"f1\": 0.086758598993289,\n \"f1_stderr\": 0.0017937356641132749\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946709,\n \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6278629755028878,\n \"acc_stderr\": 0.004823867761332464,\n \"acc_norm\": 0.8258315076677952,\n \"acc_norm_stderr\": 0.0037847921724660665\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n 
\"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.041307408795554966,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.041307408795554966\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n 
\"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094527,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094527\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.01690927688493607,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.01690927688493607\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545843,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545843\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.036959801280988226,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.036959801280988226\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n 
},\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n \"acc_stderr\": 0.014351702181636856,\n \"acc_norm\": 0.7982120051085568,\n \"acc_norm_stderr\": 0.014351702181636856\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n \"acc_stderr\": 0.015852002449862106,\n \"acc_norm\": 0.3407821229050279,\n \"acc_norm_stderr\": 0.015852002449862106\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281413,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281413\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115327,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115327\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n \"acc_stderr\": 0.012704030518851488,\n \"acc_norm\": 0.4491525423728814,\n \"acc_norm_stderr\": 0.012704030518851488\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954847,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954847\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.0300210562384403,\n \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.0300210562384403\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5646058699056372,\n \"mc2_stderr\": 0.015306312553856578\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803145\n },\n \"harness|drop|3\": {\n \"em\": 
0.006711409395973154,\n \"em_stderr\": 0.0008361500895152437,\n \"f1\": 0.086758598993289,\n \"f1_stderr\": 0.0017937356641132749\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.23730098559514784,\n \"acc_stderr\": 0.011718409178739446\n }\n}\n```", "repo_url": "https://huggingface.co/maywell/Synatra-7B-v0.3-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|arc:challenge|25_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|drop|3_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|gsm8k|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hellaswag|10_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-03-37.008028.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-03-37.008028.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-20T08-03-37.008028.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-20T08-03-37.008028.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-03-37.008028.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["**/details_harness|winogrande|5_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-20T08-03-37.008028.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_20T08_03_37.008028", "path": ["results_2023-11-20T08-03-37.008028.parquet"]}, {"split": "latest", "path": ["results_2023-11-20T08-03-37.008028.parquet"]}]}]} | 2023-11-20T08:07:21+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of maywell/Synatra-7B-v0.3-dpo
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model maywell/Synatra-7B-v0.3-dpo on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
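A minimal loading sketch is given below. The repository name is an assumption based on the leaderboard's usual `details_<org>__<model>_public` naming convention and may differ for this run; the `harness_winogrande_5` configuration and the `latest` split are taken from this card's configuration list.

```python
from datasets import load_dataset

# Assumed repository name, following the leaderboard's usual
# "details_<org>__<model>_public" convention; adjust if the actual
# repository id differs.
data = load_dataset(
    "open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-dpo_public",
    "harness_winogrande_5",   # one of the 64 task configurations
    split="latest",           # always points to the most recent run
)
```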
## Latest results
These are the latest results from run 2023-11-20T08:03:37.008028 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
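To read those aggregated numbers programmatically, one option is to load the "results" configuration (using the same assumed repository name as in the sketch above):

```python
from datasets import load_dataset

# The "results" configuration aggregates every task metric for the run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-dpo_public",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```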
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of maywell/Synatra-7B-v0.3-dpo",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model maywell/Synatra-7B-v0.3-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-20T08:03:37.008028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of maywell/Synatra-7B-v0.3-dpo",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model maywell/Synatra-7B-v0.3-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-20T08:03:37.008028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
6,
23,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of maywell/Synatra-7B-v0.3-dpo## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model maywell/Synatra-7B-v0.3-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-20T08:03:37.008028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
]
|
f345807060135bcd5f4c589a27bf2b0fb04f3a66 | # Dataset Card for "esg-fine-risks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | lewisbails/esg-fine-risks | [
"region:us"
]
| 2023-11-20T08:24:23+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "target", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "system", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24908672.528996333, "num_examples": 6966}, {"name": "val", "num_bytes": 1390281.1275295396, "num_examples": 387}, {"name": "test", "num_bytes": 1433682.452532935, "num_examples": 401}], "download_size": 13297109, "dataset_size": 27732636.10905881}} | 2023-11-20T08:24:28+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "esg-fine-risks"
More Information needed | [
"# Dataset Card for \"esg-fine-risks\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"esg-fine-risks\"\n\nMore Information needed"
]
| [
6,
17
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"esg-fine-risks\"\n\nMore Information needed"
]
|
77ccbb7ebead0c1bc579c3e21218ab8db1ba7e91 | # Dataset Card for "AutomaticSpeechRecognition_LibriSpeech-TestClean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | DynamicSuperb/AutomaticSpeechRecognition_LibriSpeech-TestClean | [
"region:us"
]
| 2023-11-20T08:26:25+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "file", "dtype": "string"}, {"name": "audio", "dtype": "audio"}, {"name": "label", "dtype": "string"}, {"name": "instruction", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 367735218.48, "num_examples": 2620}], "download_size": 350471879, "dataset_size": 367735218.48}} | 2023-11-20T08:28:15+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "AutomaticSpeechRecognition_LibriSpeech-TestClean"
More Information needed | [
"# Dataset Card for \"AutomaticSpeechRecognition_LibriSpeech-TestClean\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"AutomaticSpeechRecognition_LibriSpeech-TestClean\"\n\nMore Information needed"
]
| [
6,
28
]
| [
"passage: TAGS\n#region-us \n# Dataset Card for \"AutomaticSpeechRecognition_LibriSpeech-TestClean\"\n\nMore Information needed"
]
|